00:00:00.001 Started by upstream project "autotest-per-patch" build number 126181
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.007 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.008 The recommended git tool is: git
00:00:00.008 using credential 00000000-0000-0000-0000-000000000002
00:00:00.010 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.025 Fetching changes from the remote Git repository
00:00:00.030 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.047 Using shallow fetch with depth 1
00:00:00.047 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.047 > git --version # timeout=10
00:00:00.069 > git --version # 'git version 2.39.2'
00:00:00.069 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.102 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.102 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.241 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.253 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.266 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.266 > git config core.sparsecheckout # timeout=10
00:00:02.278 > git read-tree -mu HEAD # timeout=10
00:00:02.297 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.329 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.329 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.456 [Pipeline] Start of Pipeline
00:00:02.475 [Pipeline] library
00:00:02.477 Loading library shm_lib@master
00:00:02.477 Library shm_lib@master is cached. Copying from home.
00:00:02.493 [Pipeline] node
00:00:02.506 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.508 [Pipeline] {
00:00:02.517 [Pipeline] catchError
00:00:02.519 [Pipeline] {
00:00:02.529 [Pipeline] wrap
00:00:02.538 [Pipeline] {
00:00:02.549 [Pipeline] stage
00:00:02.551 [Pipeline] { (Prologue)
00:00:02.837 [Pipeline] sh
00:00:03.119 + logger -p user.info -t JENKINS-CI
00:00:03.137 [Pipeline] echo
00:00:03.138 Node: WFP50
00:00:03.146 [Pipeline] sh
00:00:03.440 [Pipeline] setCustomBuildProperty
00:00:03.453 [Pipeline] echo
00:00:03.455 Cleanup processes
00:00:03.461 [Pipeline] sh
00:00:03.738 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.738 1909833 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.747 [Pipeline] sh
00:00:04.024 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.025 ++ grep -v 'sudo pgrep'
00:00:04.025 ++ awk '{print $1}'
00:00:04.025 + sudo kill -9
00:00:04.025 + true
00:00:04.035 [Pipeline] cleanWs
00:00:04.042 [WS-CLEANUP] Deleting project workspace...
00:00:04.042 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.047 [WS-CLEANUP] done
00:00:04.051 [Pipeline] setCustomBuildProperty
00:00:04.061 [Pipeline] sh
00:00:04.336 + sudo git config --global --replace-all safe.directory '*'
00:00:04.411 [Pipeline] httpRequest
00:00:04.435 [Pipeline] echo
00:00:04.436 Sorcerer 10.211.164.101 is alive
00:00:04.445 [Pipeline] httpRequest
00:00:04.449 HttpMethod: GET
00:00:04.449 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.450 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.465 Response Code: HTTP/1.1 200 OK
00:00:04.466 Success: Status code 200 is in the accepted range: 200,404
00:00:04.466 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:08.780 [Pipeline] sh
00:00:09.062 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:09.338 [Pipeline] httpRequest
00:00:09.369 [Pipeline] echo
00:00:09.371 Sorcerer 10.211.164.101 is alive
00:00:09.378 [Pipeline] httpRequest
00:00:09.383 HttpMethod: GET
00:00:09.383 URL: http://10.211.164.101/packages/spdk_e7cce062d7bcec53f8a0237bb456695749792008.tar.gz
00:00:09.384 Sending request to url: http://10.211.164.101/packages/spdk_e7cce062d7bcec53f8a0237bb456695749792008.tar.gz
00:00:09.385 Response Code: HTTP/1.1 200 OK
00:00:09.386 Success: Status code 200 is in the accepted range: 200,404
00:00:09.386 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_e7cce062d7bcec53f8a0237bb456695749792008.tar.gz
00:00:29.405 [Pipeline] sh
00:00:29.688 + tar --no-same-owner -xf spdk_e7cce062d7bcec53f8a0237bb456695749792008.tar.gz
00:00:37.821 [Pipeline] sh
00:00:38.145 + git -C spdk log --oneline -n5
00:00:38.145 e7cce062d Examples/Perf: correct the calculation of total bandwidth
00:00:38.145 3b4b1d00c libvfio-user: bump MAX_DMA_REGIONS
00:00:38.145 32a79de81 lib/event: add disable_cpumask_locks to spdk_app_opts
00:00:38.145 719d03c6a sock/uring: only register net impl if supported
00:00:38.145 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:38.158 [Pipeline] }
00:00:38.175 [Pipeline] // stage
00:00:38.184 [Pipeline] stage
00:00:38.185 [Pipeline] { (Prepare)
00:00:38.201 [Pipeline] writeFile
00:00:38.218 [Pipeline] sh
00:00:38.499 + logger -p user.info -t JENKINS-CI
00:00:38.513 [Pipeline] sh
00:00:38.796 + logger -p user.info -t JENKINS-CI
00:00:38.811 [Pipeline] sh
00:00:39.095 + cat autorun-spdk.conf
00:00:39.095 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:39.095 SPDK_TEST_BLOCKDEV=1
00:00:39.095 SPDK_TEST_ISAL=1
00:00:39.095 SPDK_TEST_CRYPTO=1
00:00:39.095 SPDK_TEST_REDUCE=1
00:00:39.095 SPDK_TEST_VBDEV_COMPRESS=1
00:00:39.095 SPDK_RUN_UBSAN=1
00:00:39.103 RUN_NIGHTLY=0
00:00:39.110 [Pipeline] readFile
00:00:39.141 [Pipeline] withEnv
00:00:39.143 [Pipeline] {
00:00:39.159 [Pipeline] sh
00:00:39.446 + set -ex
00:00:39.446 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:39.446 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:39.446 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:39.446 ++ SPDK_TEST_BLOCKDEV=1
00:00:39.446 ++ SPDK_TEST_ISAL=1
00:00:39.446 ++ SPDK_TEST_CRYPTO=1
00:00:39.446 ++ SPDK_TEST_REDUCE=1
00:00:39.446 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:39.446 ++ SPDK_RUN_UBSAN=1
00:00:39.446 ++ RUN_NIGHTLY=0
00:00:39.446 + case $SPDK_TEST_NVMF_NICS in
00:00:39.446 + DRIVERS=
00:00:39.446 + [[ -n '' ]]
00:00:39.446 + exit 0
00:00:39.455 [Pipeline] }
00:00:39.469 [Pipeline] // withEnv
00:00:39.476 [Pipeline] }
00:00:39.497 [Pipeline] // stage
00:00:39.510 [Pipeline] catchError
00:00:39.512 [Pipeline] {
00:00:39.530 [Pipeline] timeout
00:00:39.530 Timeout set to expire in 40 min
00:00:39.533 [Pipeline] {
00:00:39.549 [Pipeline] stage
00:00:39.551 [Pipeline] { (Tests)
00:00:39.570 [Pipeline] sh
00:00:39.854 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:39.854 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:39.854 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:39.854 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:39.854 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:39.854 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:39.854 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:39.854 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:39.854 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:39.854 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:39.854 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:39.854 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:39.854 + source /etc/os-release
00:00:39.854 ++ NAME='Fedora Linux'
00:00:39.854 ++ VERSION='38 (Cloud Edition)'
00:00:39.854 ++ ID=fedora
00:00:39.854 ++ VERSION_ID=38
00:00:39.854 ++ VERSION_CODENAME=
00:00:39.854 ++ PLATFORM_ID=platform:f38
00:00:39.854 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:39.854 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:39.854 ++ LOGO=fedora-logo-icon
00:00:39.854 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:39.854 ++ HOME_URL=https://fedoraproject.org/
00:00:39.854 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:39.854 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:39.854 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:39.854 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:39.854 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:39.854 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:39.854 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:39.854 ++ SUPPORT_END=2024-05-14
00:00:39.854 ++ VARIANT='Cloud Edition'
00:00:39.854 ++ VARIANT_ID=cloud
00:00:39.854 + uname -a
00:00:39.854 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:39.854 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:00:43.145 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:00:43.145 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:00:43.145 Hugepages
00:00:43.145 node hugesize free / total
00:00:43.145 node0 1048576kB 0 / 0
00:00:43.145 node0 2048kB 0 / 0
00:00:43.145 node1 1048576kB 0 / 0
00:00:43.145 node1 2048kB 0 / 0
00:00:43.145
00:00:43.145 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:43.145 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:43.145 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:43.145 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:00:43.145 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:43.145 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:43.145 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:00:43.145 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:00:43.145 + rm -f /tmp/spdk-ld-path
00:00:43.145 + source autorun-spdk.conf
00:00:43.145 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:43.145 ++ SPDK_TEST_BLOCKDEV=1
00:00:43.145 ++ SPDK_TEST_ISAL=1
00:00:43.145 ++ SPDK_TEST_CRYPTO=1
00:00:43.145 ++ SPDK_TEST_REDUCE=1
00:00:43.145 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:43.145 ++ SPDK_RUN_UBSAN=1
00:00:43.145 ++ RUN_NIGHTLY=0
00:00:43.145 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:43.145 + [[ -n '' ]]
00:00:43.145 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:43.145 + for M in /var/spdk/build-*-manifest.txt
00:00:43.145 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:43.145 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:43.145 + for M in /var/spdk/build-*-manifest.txt
00:00:43.145 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:43.145 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:43.145 ++ uname
00:00:43.145 + [[ Linux == \L\i\n\u\x ]]
00:00:43.145 + sudo dmesg -T
00:00:43.145 + sudo dmesg --clear
00:00:43.405 + dmesg_pid=1910805
00:00:43.405 + [[ Fedora Linux == FreeBSD ]]
00:00:43.405 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:43.405 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:43.405 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:43.405 + [[ -x /usr/src/fio-static/fio ]]
00:00:43.405 + export FIO_BIN=/usr/src/fio-static/fio
00:00:43.405 + FIO_BIN=/usr/src/fio-static/fio
00:00:43.405 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:43.405 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:43.405 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:43.405 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:43.405 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:43.405 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:43.405 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:43.405 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:43.405 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:43.405 + sudo dmesg -Tw
00:00:43.405 Test configuration:
00:00:43.405 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:43.405 SPDK_TEST_BLOCKDEV=1
00:00:43.405 SPDK_TEST_ISAL=1
00:00:43.405 SPDK_TEST_CRYPTO=1
00:00:43.405 SPDK_TEST_REDUCE=1
00:00:43.405 SPDK_TEST_VBDEV_COMPRESS=1
00:00:43.405 SPDK_RUN_UBSAN=1
00:00:43.405 RUN_NIGHTLY=0
13:19:22 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:00:43.405 13:19:22 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:43.405 13:19:22 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:43.405 13:19:22 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:43.405 13:19:22 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:43.405 13:19:22 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:43.405 13:19:22 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:43.406 13:19:22 -- paths/export.sh@5 -- $ export PATH
00:00:43.406 13:19:22 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:43.406 13:19:22 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:00:43.406 13:19:22 -- common/autobuild_common.sh@444 -- $ date +%s
00:00:43.406 13:19:22 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721042362.XXXXXX
00:00:43.406 13:19:22 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721042362.4rHAf9
00:00:43.406 13:19:22 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:00:43.406 13:19:22 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:00:43.406 13:19:22 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:00:43.406 13:19:22 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:43.406 13:19:22 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:43.406 13:19:22 -- common/autobuild_common.sh@460 -- $ get_config_params
00:00:43.406 13:19:22 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:00:43.406 13:19:22 -- common/autotest_common.sh@10 -- $ set +x
00:00:43.406 13:19:22 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:00:43.406 13:19:22 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:00:43.406 13:19:22 -- pm/common@17 -- $ local monitor
00:00:43.406 13:19:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:43.406 13:19:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:43.406 13:19:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:43.406 13:19:22 -- pm/common@21 -- $ date +%s
00:00:43.406 13:19:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:43.406 13:19:22 -- pm/common@21 -- $ date +%s
00:00:43.406 13:19:22 -- pm/common@25 -- $ sleep 1
00:00:43.406 13:19:22 -- pm/common@21 -- $ date +%s
00:00:43.406 13:19:22 -- pm/common@21 -- $ date +%s
00:00:43.406 13:19:22 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042362
00:00:43.406 13:19:22 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042362
00:00:43.406 13:19:22 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042362
00:00:43.406 13:19:22 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721042362
00:00:43.406 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042362_collect-vmstat.pm.log
00:00:43.406 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042362_collect-cpu-load.pm.log
00:00:43.406 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042362_collect-cpu-temp.pm.log
00:00:43.406 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721042362_collect-bmc-pm.bmc.pm.log
00:00:44.341 13:19:23 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:00:44.341 13:19:23 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:44.341 13:19:23 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:44.341 13:19:23 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:44.341 13:19:23 -- spdk/autobuild.sh@16 -- $ date -u
00:00:44.341 Mon Jul 15 11:19:23 AM UTC 2024
00:00:44.341 13:19:23 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:44.341 v24.09-pre-205-ge7cce062d
00:00:44.599 13:19:23 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:44.599 13:19:23 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:44.599 13:19:23 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:44.599 13:19:23 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:00:44.599 13:19:23 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:00:44.599 13:19:23 -- common/autotest_common.sh@10 -- $ set +x
00:00:44.599 ************************************
00:00:44.599 START TEST ubsan
00:00:44.599 ************************************
00:00:44.599 13:19:23 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:00:44.599 using ubsan
00:00:44.599
00:00:44.599 real 0m0.001s
00:00:44.599 user 0m0.000s
00:00:44.599 sys 0m0.000s
00:00:44.599 13:19:23 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:00:44.599 13:19:23 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:00:44.599 ************************************
00:00:44.599 END TEST ubsan
00:00:44.599 ************************************
00:00:44.599 13:19:23 -- common/autotest_common.sh@1142 -- $ return 0
00:00:44.599 13:19:23 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:44.599 13:19:23 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:44.599 13:19:23 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:00:44.599 13:19:23 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:00:44.599 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:00:44.599 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:00:45.165 Using 'verbs' RDMA provider
00:01:01.417 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:13.629 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:13.888 Creating mk/config.mk...done.
00:01:13.888 Creating mk/cc.flags.mk...done.
00:01:13.888 Type 'make' to build.
00:01:13.888 13:19:53 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:13.888 13:19:53 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:13.888 13:19:53 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:13.888 13:19:53 -- common/autotest_common.sh@10 -- $ set +x
00:01:13.888 ************************************
00:01:13.888 START TEST make
00:01:13.888 ************************************
00:01:13.888 13:19:53 make -- common/autotest_common.sh@1123 -- $ make -j72
00:01:14.454 make[1]: Nothing to be done for 'all'.
00:01:53.194 The Meson build system
00:01:53.194 Version: 1.3.1
00:01:53.194 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:01:53.194 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:01:53.194 Build type: native build
00:01:53.194 Program cat found: YES (/usr/bin/cat)
00:01:53.194 Project name: DPDK
00:01:53.194 Project version: 24.03.0
00:01:53.194 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:53.194 C linker for the host machine: cc ld.bfd 2.39-16
00:01:53.194 Host machine cpu family: x86_64
00:01:53.194 Host machine cpu: x86_64
00:01:53.194 Message: ## Building in Developer Mode ##
00:01:53.194 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:53.194 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:53.194 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:53.194 Program python3 found: YES (/usr/bin/python3)
00:01:53.194 Program cat found: YES (/usr/bin/cat)
00:01:53.194 Compiler for C supports arguments -march=native: YES
00:01:53.194 Checking for size of "void *" : 8
00:01:53.194 Checking for size of "void *" : 8 (cached)
00:01:53.194 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:01:53.194 Library m found: YES
00:01:53.194 Library numa found: YES
00:01:53.194 Has header "numaif.h" : YES
00:01:53.194 Library fdt found: NO
00:01:53.194 Library execinfo found: NO
00:01:53.194 Has header "execinfo.h" : YES
00:01:53.194 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:53.194 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:53.194 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:53.194 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:53.194 Run-time dependency openssl found: YES 3.0.9
00:01:53.194 Run-time dependency libpcap found: YES 1.10.4
00:01:53.194 Has header "pcap.h" with dependency libpcap: YES
00:01:53.194 Compiler for C supports arguments -Wcast-qual: YES
00:01:53.194 Compiler for C supports arguments -Wdeprecated: YES
00:01:53.194 Compiler for C supports arguments -Wformat: YES
00:01:53.194 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:53.194 Compiler for C supports arguments -Wformat-security: NO
00:01:53.194 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:53.194 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:53.194 Compiler for C supports arguments -Wnested-externs: YES
00:01:53.194 Compiler for C supports arguments -Wold-style-definition: YES
00:01:53.194 Compiler for C supports arguments -Wpointer-arith: YES
00:01:53.194 Compiler for C supports arguments -Wsign-compare: YES
00:01:53.194 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:53.194 Compiler for C supports arguments -Wundef: YES
00:01:53.194 Compiler for C supports arguments -Wwrite-strings: YES
00:01:53.194 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:53.194 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:53.194 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:53.194 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:53.194 Program objdump found: YES (/usr/bin/objdump)
00:01:53.194 Compiler for C supports arguments -mavx512f: YES
00:01:53.194 Checking if "AVX512 checking" compiles: YES
00:01:53.194 Fetching value of define "__SSE4_2__" : 1
00:01:53.194 Fetching value of define "__AES__" : 1
00:01:53.194 Fetching value of define "__AVX__" : 1
00:01:53.194 Fetching value of define "__AVX2__" : 1
00:01:53.194 Fetching value of define "__AVX512BW__" : 1
00:01:53.194 Fetching value of define "__AVX512CD__" : 1
00:01:53.194 Fetching value of define "__AVX512DQ__" : 1
00:01:53.194 Fetching value of define "__AVX512F__" : 1
00:01:53.194 Fetching value of define "__AVX512VL__" : 1
00:01:53.194 Fetching value of define "__PCLMUL__" : 1
00:01:53.194 Fetching value of define "__RDRND__" : 1
00:01:53.194 Fetching value of define "__RDSEED__" : 1
00:01:53.194 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:53.194 Fetching value of define "__znver1__" : (undefined)
00:01:53.194 Fetching value of define "__znver2__" : (undefined)
00:01:53.194 Fetching value of define "__znver3__" : (undefined)
00:01:53.194 Fetching value of define "__znver4__" : (undefined)
00:01:53.194 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:53.194 Message: lib/log: Defining dependency "log"
00:01:53.194 Message: lib/kvargs: Defining dependency "kvargs"
00:01:53.194 Message: lib/telemetry: Defining dependency "telemetry"
00:01:53.194 Checking for function "getentropy" : NO
00:01:53.194 Message: lib/eal: Defining dependency "eal"
00:01:53.194 Message: lib/ring: Defining dependency "ring"
00:01:53.194 Message: lib/rcu: Defining dependency "rcu"
00:01:53.194 Message: lib/mempool: Defining dependency "mempool"
00:01:53.194 Message: lib/mbuf: Defining dependency "mbuf"
00:01:53.194 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:53.194 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:53.194 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:53.194 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:53.194 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:53.194 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:53.194 Compiler for C supports arguments -mpclmul: YES
00:01:53.194 Compiler for C supports arguments -maes: YES
00:01:53.194 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:53.194 Compiler for C supports arguments -mavx512bw: YES
00:01:53.194 Compiler for C supports arguments -mavx512dq: YES
00:01:53.194 Compiler for C supports arguments -mavx512vl: YES
00:01:53.194 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:53.194 Compiler for C supports arguments -mavx2: YES
00:01:53.194 Compiler for C supports arguments -mavx: YES
00:01:53.194 Message: lib/net: Defining dependency "net"
00:01:53.194 Message: lib/meter: Defining dependency "meter"
00:01:53.194 Message: lib/ethdev: Defining dependency "ethdev"
00:01:53.194 Message: lib/pci: Defining dependency "pci"
00:01:53.194 Message: lib/cmdline: Defining dependency "cmdline"
00:01:53.194 Message: lib/hash: Defining dependency "hash"
00:01:53.194 Message: lib/timer: Defining dependency "timer"
00:01:53.194 Message: lib/compressdev: Defining dependency "compressdev"
00:01:53.194 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:53.194 Message: lib/dmadev: Defining dependency "dmadev"
00:01:53.194 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:53.194 Message: lib/power: Defining dependency "power"
00:01:53.194 Message: lib/reorder: Defining dependency "reorder"
00:01:53.194 Message: lib/security: Defining dependency "security"
00:01:53.194 Has header "linux/userfaultfd.h" : YES
00:01:53.194 Has header "linux/vduse.h" : YES
00:01:53.194 Message: lib/vhost: Defining dependency "vhost"
00:01:53.194 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:53.194 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:01:53.194 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:53.194 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:53.194 Compiler for C supports arguments -std=c11: YES
00:01:53.194 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:01:53.194 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:01:53.194 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:01:53.194 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:01:53.194 Run-time dependency libmlx5 found: YES 1.24.44.0
00:01:53.194 Run-time dependency libibverbs found: YES 1.14.44.0
00:01:53.194 Library mtcr_ul found: NO
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:01:53.194 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:56.481 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:56.481 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:56.482 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:56.482 Configuring mlx5_autoconf.h using configuration 00:01:56.482 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:56.482 Run-time dependency libcrypto found: YES 3.0.9 00:01:56.482 Library IPSec_MB found: YES 00:01:56.482 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:56.482 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:56.482 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:56.482 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:56.482 Library IPSec_MB found: YES 00:01:56.482 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:56.482 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:56.482 Compiler for C supports 
arguments -std=c11: YES (cached) 00:01:56.482 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:56.482 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:56.482 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:56.482 Library libisal found: NO 00:01:56.482 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:56.482 Compiler for C supports arguments -std=c11: YES (cached) 00:01:56.482 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:56.482 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:56.482 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:56.482 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:56.482 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:56.482 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:56.482 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:56.482 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:56.482 Program doxygen found: YES (/usr/bin/doxygen) 00:01:56.482 Configuring doxy-api-html.conf using configuration 00:01:56.482 Configuring doxy-api-man.conf using configuration 00:01:56.482 Program mandb found: YES (/usr/bin/mandb) 00:01:56.482 Program sphinx-build found: NO 00:01:56.482 Configuring rte_build_config.h using configuration 00:01:56.482 Message: 00:01:56.482 ================= 00:01:56.482 Applications Enabled 00:01:56.482 ================= 00:01:56.482 
00:01:56.482 apps: 00:01:56.482 00:01:56.482 00:01:56.482 Message: 00:01:56.482 ================= 00:01:56.482 Libraries Enabled 00:01:56.482 ================= 00:01:56.482 00:01:56.482 libs: 00:01:56.482 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:56.482 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:56.482 cryptodev, dmadev, power, reorder, security, vhost, 00:01:56.482 00:01:56.482 Message: 00:01:56.482 =============== 00:01:56.482 Drivers Enabled 00:01:56.482 =============== 00:01:56.482 00:01:56.482 common: 00:01:56.482 mlx5, qat, 00:01:56.482 bus: 00:01:56.482 auxiliary, pci, vdev, 00:01:56.482 mempool: 00:01:56.482 ring, 00:01:56.482 dma: 00:01:56.482 00:01:56.482 net: 00:01:56.482 00:01:56.482 crypto: 00:01:56.482 ipsec_mb, mlx5, 00:01:56.482 compress: 00:01:56.482 isal, mlx5, 00:01:56.482 vdpa: 00:01:56.482 00:01:56.482 00:01:56.482 Message: 00:01:56.482 ================= 00:01:56.482 Content Skipped 00:01:56.482 ================= 00:01:56.482 00:01:56.482 apps: 00:01:56.482 dumpcap: explicitly disabled via build config 00:01:56.482 graph: explicitly disabled via build config 00:01:56.482 pdump: explicitly disabled via build config 00:01:56.482 proc-info: explicitly disabled via build config 00:01:56.482 test-acl: explicitly disabled via build config 00:01:56.482 test-bbdev: explicitly disabled via build config 00:01:56.482 test-cmdline: explicitly disabled via build config 00:01:56.482 test-compress-perf: explicitly disabled via build config 00:01:56.482 test-crypto-perf: explicitly disabled via build config 00:01:56.482 test-dma-perf: explicitly disabled via build config 00:01:56.482 test-eventdev: explicitly disabled via build config 00:01:56.482 test-fib: explicitly disabled via build config 00:01:56.482 test-flow-perf: explicitly disabled via build config 00:01:56.482 test-gpudev: explicitly disabled via build config 00:01:56.482 test-mldev: explicitly disabled via build config 00:01:56.482 test-pipeline: explicitly 
disabled via build config 00:01:56.482 test-pmd: explicitly disabled via build config 00:01:56.482 test-regex: explicitly disabled via build config 00:01:56.482 test-sad: explicitly disabled via build config 00:01:56.482 test-security-perf: explicitly disabled via build config 00:01:56.482 00:01:56.482 libs: 00:01:56.482 argparse: explicitly disabled via build config 00:01:56.482 metrics: explicitly disabled via build config 00:01:56.482 acl: explicitly disabled via build config 00:01:56.482 bbdev: explicitly disabled via build config 00:01:56.482 bitratestats: explicitly disabled via build config 00:01:56.482 bpf: explicitly disabled via build config 00:01:56.482 cfgfile: explicitly disabled via build config 00:01:56.482 distributor: explicitly disabled via build config 00:01:56.482 efd: explicitly disabled via build config 00:01:56.482 eventdev: explicitly disabled via build config 00:01:56.482 dispatcher: explicitly disabled via build config 00:01:56.482 gpudev: explicitly disabled via build config 00:01:56.482 gro: explicitly disabled via build config 00:01:56.482 gso: explicitly disabled via build config 00:01:56.482 ip_frag: explicitly disabled via build config 00:01:56.482 jobstats: explicitly disabled via build config 00:01:56.482 latencystats: explicitly disabled via build config 00:01:56.482 lpm: explicitly disabled via build config 00:01:56.482 member: explicitly disabled via build config 00:01:56.482 pcapng: explicitly disabled via build config 00:01:56.482 rawdev: explicitly disabled via build config 00:01:56.482 regexdev: explicitly disabled via build config 00:01:56.482 mldev: explicitly disabled via build config 00:01:56.482 rib: explicitly disabled via build config 00:01:56.482 sched: explicitly disabled via build config 00:01:56.482 stack: explicitly disabled via build config 00:01:56.482 ipsec: explicitly disabled via build config 00:01:56.482 pdcp: explicitly disabled via build config 00:01:56.482 fib: explicitly disabled via build config 
00:01:56.482 port: explicitly disabled via build config 00:01:56.482 pdump: explicitly disabled via build config 00:01:56.482 table: explicitly disabled via build config 00:01:56.482 pipeline: explicitly disabled via build config 00:01:56.482 graph: explicitly disabled via build config 00:01:56.482 node: explicitly disabled via build config 00:01:56.482 00:01:56.482 drivers: 00:01:56.482 common/cpt: not in enabled drivers build config 00:01:56.482 common/dpaax: not in enabled drivers build config 00:01:56.482 common/iavf: not in enabled drivers build config 00:01:56.482 common/idpf: not in enabled drivers build config 00:01:56.482 common/ionic: not in enabled drivers build config 00:01:56.482 common/mvep: not in enabled drivers build config 00:01:56.482 common/octeontx: not in enabled drivers build config 00:01:56.482 bus/cdx: not in enabled drivers build config 00:01:56.482 bus/dpaa: not in enabled drivers build config 00:01:56.482 bus/fslmc: not in enabled drivers build config 00:01:56.482 bus/ifpga: not in enabled drivers build config 00:01:56.482 bus/platform: not in enabled drivers build config 00:01:56.482 bus/uacce: not in enabled drivers build config 00:01:56.482 bus/vmbus: not in enabled drivers build config 00:01:56.482 common/cnxk: not in enabled drivers build config 00:01:56.482 common/nfp: not in enabled drivers build config 00:01:56.482 common/nitrox: not in enabled drivers build config 00:01:56.482 common/sfc_efx: not in enabled drivers build config 00:01:56.482 mempool/bucket: not in enabled drivers build config 00:01:56.482 mempool/cnxk: not in enabled drivers build config 00:01:56.482 mempool/dpaa: not in enabled drivers build config 00:01:56.482 mempool/dpaa2: not in enabled drivers build config 00:01:56.482 mempool/octeontx: not in enabled drivers build config 00:01:56.482 mempool/stack: not in enabled drivers build config 00:01:56.482 dma/cnxk: not in enabled drivers build config 00:01:56.482 dma/dpaa: not in enabled drivers build config 
00:01:56.482 dma/dpaa2: not in enabled drivers build config 00:01:56.482 dma/hisilicon: not in enabled drivers build config 00:01:56.482 dma/idxd: not in enabled drivers build config 00:01:56.482 dma/ioat: not in enabled drivers build config 00:01:56.482 dma/skeleton: not in enabled drivers build config 00:01:56.482 net/af_packet: not in enabled drivers build config 00:01:56.482 net/af_xdp: not in enabled drivers build config 00:01:56.482 net/ark: not in enabled drivers build config 00:01:56.482 net/atlantic: not in enabled drivers build config 00:01:56.483 net/avp: not in enabled drivers build config 00:01:56.483 net/axgbe: not in enabled drivers build config 00:01:56.483 net/bnx2x: not in enabled drivers build config 00:01:56.483 net/bnxt: not in enabled drivers build config 00:01:56.483 net/bonding: not in enabled drivers build config 00:01:56.483 net/cnxk: not in enabled drivers build config 00:01:56.483 net/cpfl: not in enabled drivers build config 00:01:56.483 net/cxgbe: not in enabled drivers build config 00:01:56.483 net/dpaa: not in enabled drivers build config 00:01:56.483 net/dpaa2: not in enabled drivers build config 00:01:56.483 net/e1000: not in enabled drivers build config 00:01:56.483 net/ena: not in enabled drivers build config 00:01:56.483 net/enetc: not in enabled drivers build config 00:01:56.483 net/enetfec: not in enabled drivers build config 00:01:56.483 net/enic: not in enabled drivers build config 00:01:56.483 net/failsafe: not in enabled drivers build config 00:01:56.483 net/fm10k: not in enabled drivers build config 00:01:56.483 net/gve: not in enabled drivers build config 00:01:56.483 net/hinic: not in enabled drivers build config 00:01:56.483 net/hns3: not in enabled drivers build config 00:01:56.483 net/i40e: not in enabled drivers build config 00:01:56.483 net/iavf: not in enabled drivers build config 00:01:56.483 net/ice: not in enabled drivers build config 00:01:56.483 net/idpf: not in enabled drivers build config 00:01:56.483 
net/igc: not in enabled drivers build config 00:01:56.483 net/ionic: not in enabled drivers build config 00:01:56.483 net/ipn3ke: not in enabled drivers build config 00:01:56.483 net/ixgbe: not in enabled drivers build config 00:01:56.483 net/mana: not in enabled drivers build config 00:01:56.483 net/memif: not in enabled drivers build config 00:01:56.483 net/mlx4: not in enabled drivers build config 00:01:56.483 net/mlx5: not in enabled drivers build config 00:01:56.483 net/mvneta: not in enabled drivers build config 00:01:56.483 net/mvpp2: not in enabled drivers build config 00:01:56.483 net/netvsc: not in enabled drivers build config 00:01:56.483 net/nfb: not in enabled drivers build config 00:01:56.483 net/nfp: not in enabled drivers build config 00:01:56.483 net/ngbe: not in enabled drivers build config 00:01:56.483 net/null: not in enabled drivers build config 00:01:56.483 net/octeontx: not in enabled drivers build config 00:01:56.483 net/octeon_ep: not in enabled drivers build config 00:01:56.483 net/pcap: not in enabled drivers build config 00:01:56.483 net/pfe: not in enabled drivers build config 00:01:56.483 net/qede: not in enabled drivers build config 00:01:56.483 net/ring: not in enabled drivers build config 00:01:56.483 net/sfc: not in enabled drivers build config 00:01:56.483 net/softnic: not in enabled drivers build config 00:01:56.483 net/tap: not in enabled drivers build config 00:01:56.483 net/thunderx: not in enabled drivers build config 00:01:56.483 net/txgbe: not in enabled drivers build config 00:01:56.483 net/vdev_netvsc: not in enabled drivers build config 00:01:56.483 net/vhost: not in enabled drivers build config 00:01:56.483 net/virtio: not in enabled drivers build config 00:01:56.483 net/vmxnet3: not in enabled drivers build config 00:01:56.483 raw/*: missing internal dependency, "rawdev" 00:01:56.483 crypto/armv8: not in enabled drivers build config 00:01:56.483 crypto/bcmfs: not in enabled drivers build config 00:01:56.483 
crypto/caam_jr: not in enabled drivers build config 00:01:56.483 crypto/ccp: not in enabled drivers build config 00:01:56.483 crypto/cnxk: not in enabled drivers build config 00:01:56.483 crypto/dpaa_sec: not in enabled drivers build config 00:01:56.483 crypto/dpaa2_sec: not in enabled drivers build config 00:01:56.483 crypto/mvsam: not in enabled drivers build config 00:01:56.483 crypto/nitrox: not in enabled drivers build config 00:01:56.483 crypto/null: not in enabled drivers build config 00:01:56.483 crypto/octeontx: not in enabled drivers build config 00:01:56.483 crypto/openssl: not in enabled drivers build config 00:01:56.483 crypto/scheduler: not in enabled drivers build config 00:01:56.483 crypto/uadk: not in enabled drivers build config 00:01:56.483 crypto/virtio: not in enabled drivers build config 00:01:56.483 compress/nitrox: not in enabled drivers build config 00:01:56.483 compress/octeontx: not in enabled drivers build config 00:01:56.483 compress/zlib: not in enabled drivers build config 00:01:56.483 regex/*: missing internal dependency, "regexdev" 00:01:56.483 ml/*: missing internal dependency, "mldev" 00:01:56.483 vdpa/ifc: not in enabled drivers build config 00:01:56.483 vdpa/mlx5: not in enabled drivers build config 00:01:56.483 vdpa/nfp: not in enabled drivers build config 00:01:56.483 vdpa/sfc: not in enabled drivers build config 00:01:56.483 event/*: missing internal dependency, "eventdev" 00:01:56.483 baseband/*: missing internal dependency, "bbdev" 00:01:56.483 gpu/*: missing internal dependency, "gpudev" 00:01:56.483 00:01:56.483 00:01:56.483 Build targets in project: 115 00:01:56.483 00:01:56.483 DPDK 24.03.0 00:01:56.483 00:01:56.483 User defined options 00:01:56.483 buildtype : debug 00:01:56.483 default_library : shared 00:01:56.483 libdir : lib 00:01:56.483 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:56.483 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:56.483 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:56.483 cpu_instruction_set: native 00:01:56.483 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test 00:01:56.483 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table 00:01:56.483 enable_docs : false 00:01:56.483 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:56.483 enable_kmods : false 00:01:56.483 max_lcores : 128 00:01:56.483 tests : false 00:01:56.483 00:01:56.483 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:57.055 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:57.055 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:57.055 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:57.055 [3/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:57.055 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:57.055 [5/378] Linking static target lib/librte_kvargs.a 00:01:57.055 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:57.055 [7/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:57.055 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:57.055 [9/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:57.055 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:57.315 [11/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:57.315 [12/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:57.315 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:57.315 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:57.315 [15/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:57.315 [16/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:57.315 [17/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:57.315 [18/378] Linking static target lib/librte_log.a 00:01:57.315 [19/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:57.582 [20/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:57.582 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:57.582 [22/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.582 [23/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:57.582 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:57.582 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:57.582 [26/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:57.582 [27/378] Linking static target lib/librte_telemetry.a 00:01:57.582 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:57.582 [29/378] 
Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:57.582 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:57.582 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:57.582 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:57.582 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:57.582 [34/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:57.582 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:57.582 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:57.582 [37/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:57.582 [38/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:57.582 [39/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:57.582 [40/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:57.582 [41/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:57.840 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:57.840 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:57.840 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:57.840 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:57.840 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:57.840 [47/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:57.840 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:57.840 [49/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:57.840 [50/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:57.840 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:57.840 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:57.840 [53/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:57.840 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:57.840 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:57.840 [56/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:57.840 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:57.840 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:57.840 [59/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:57.840 [60/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:57.840 [61/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:57.840 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:57.840 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:57.840 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:57.840 [65/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:57.840 [66/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:57.840 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:57.840 [68/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:57.840 [69/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:57.840 [70/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:57.840 [71/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:57.840 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:57.840 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 
00:01:57.840 [74/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:57.840 [75/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:57.840 [76/378] Linking static target lib/librte_pci.a 00:01:57.840 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:57.840 [78/378] Linking static target lib/librte_ring.a 00:01:57.840 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:57.840 [80/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:01:57.840 [81/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:57.840 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:57.840 [83/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:57.840 [84/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:57.840 [85/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:57.840 [86/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:57.840 [87/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:57.840 [88/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:57.840 [89/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:57.840 [90/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:57.840 [91/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:57.840 [92/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:57.840 [93/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:57.840 [94/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:57.840 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:57.840 [96/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:57.840 
[97/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:58.102 [98/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:58.102 [99/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:58.102 [100/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:58.102 [101/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:58.102 [102/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:58.102 [103/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:58.102 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:58.102 [105/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.102 [106/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:58.102 [107/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:58.102 [108/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:58.102 [109/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:58.102 [110/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:01:58.102 [111/378] Linking target lib/librte_log.so.24.1 00:01:58.102 [112/378] Linking static target lib/librte_net.a 00:01:58.102 [113/378] Linking static target lib/librte_rcu.a 00:01:58.102 [114/378] Linking static target lib/librte_mempool.a 00:01:58.102 [115/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:58.102 [116/378] Linking static target lib/librte_meter.a 00:01:58.102 [117/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:58.102 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:58.362 [119/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.362 [120/378] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:58.362 [121/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:58.362 [122/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:58.362 [123/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.362 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:58.362 [125/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:58.362 [126/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:01:58.362 [127/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:58.362 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:58.362 [129/378] Linking target lib/librte_kvargs.so.24.1 00:01:58.362 [130/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:58.362 [131/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:58.362 [132/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:58.362 [133/378] Linking static target lib/librte_cmdline.a 00:01:58.362 [134/378] Linking static target lib/librte_mbuf.a 00:01:58.362 [135/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.362 [136/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:58.362 [137/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:01:58.362 [138/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:58.362 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:58.362 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:58.362 [141/378] Linking static target lib/librte_timer.a 00:01:58.362 [142/378] Compiling C object 
lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:58.624 [143/378] Linking target lib/librte_telemetry.so.24.1 00:01:58.624 [144/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:58.624 [145/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:58.624 [146/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:58.624 [147/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:58.624 [148/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:58.624 [149/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:58.624 [150/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:58.624 [151/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.624 [152/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:58.624 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:58.624 [154/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:58.624 [155/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:58.624 [156/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:58.624 [157/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:58.624 [158/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:58.624 [159/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.624 [160/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:58.624 [161/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:01:58.624 [162/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:58.624 [163/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:58.624 [164/378] 
Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:58.624 [165/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:58.624 [166/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:58.624 [167/378] Linking static target lib/librte_compressdev.a 00:01:58.624 [168/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:01:58.624 [169/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.624 [170/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:01:58.624 [171/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:58.624 [172/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:58.624 [173/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:01:58.624 [174/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:58.624 [175/378] Linking static target lib/librte_eal.a 00:01:58.624 [176/378] Linking static target lib/librte_dmadev.a 00:01:58.624 [177/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:01:58.624 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:58.624 [179/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:58.624 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:58.624 [181/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:58.887 [182/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:58.887 [183/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:58.887 [184/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:58.887 [185/378] Compiling C object 
drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:58.887 [186/378] Linking static target lib/librte_power.a 00:01:58.887 [187/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:58.887 [188/378] Linking static target lib/librte_reorder.a 00:01:58.887 [189/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:58.887 [190/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:01:58.887 [191/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:58.887 [192/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:58.887 [193/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:58.887 [194/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:58.887 [195/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:58.887 [196/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:58.887 [197/378] Linking static target lib/librte_security.a 00:01:58.887 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:59.146 [199/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:59.146 [200/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:59.146 [201/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:59.146 [202/378] Linking static target drivers/librte_bus_auxiliary.a 00:01:59.146 [203/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:59.146 [204/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:59.146 [205/378] Linking static target lib/librte_hash.a 00:01:59.146 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:59.146 [207/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:59.147 [208/378] 
Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:01:59.147 [209/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.147 [210/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:59.147 [211/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:59.147 [212/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:59.147 [213/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:59.147 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:59.147 [215/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:59.147 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:59.147 [217/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:59.147 [218/378] Linking static target drivers/librte_bus_vdev.a 00:01:59.147 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:59.147 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:59.147 [221/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.147 [222/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:59.147 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:59.147 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:59.471 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:59.471 [226/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:59.471 [227/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:01:59.471 [228/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:59.471 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:01:59.471 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:01:59.471 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:59.471 [232/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:59.471 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:59.471 [234/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:59.471 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:01:59.471 [236/378] Linking static target drivers/librte_bus_pci.a 00:01:59.471 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:01:59.471 [238/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:59.471 [240/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:01:59.471 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:59.471 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:59.471 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:59.471 [245/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:59.471 [246/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [247/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:59.471 [249/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [250/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:59.471 [251/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:59.471 [252/378] Linking static target lib/librte_cryptodev.a 00:01:59.471 [253/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:59.471 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:01:59.471 [255/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:59.471 [256/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:59.471 [257/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.471 [258/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:59.730 [259/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:59.730 [260/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:59.730 [261/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:59.730 [262/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.730 [263/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:59.730 [264/378] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:59.730 [265/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:59.730 [266/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:59.730 [267/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:59.730 [268/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:01:59.730 [269/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.730 [270/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:59.730 [271/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:59.730 [272/378] Linking static target lib/librte_ethdev.a 00:01:59.730 [273/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:59.730 [274/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:59.730 [275/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.730 [276/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:59.730 [277/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:59.730 [278/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:59.730 [279/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:59.730 [280/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:59.730 [281/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:59.989 [282/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:59.989 [283/378] Compiling C object 
drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:59.989 [284/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:59.989 [285/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:59.989 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:59.989 [287/378] Linking static target drivers/librte_mempool_ring.a 00:01:59.989 [288/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:59.989 [289/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:59.989 [290/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:59.989 [291/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:01:59.989 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:59.989 [293/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:59.989 [294/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.989 [295/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:59.989 [296/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:59.989 [297/378] Linking static target drivers/librte_compress_mlx5.a 00:01:59.989 [298/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:59.989 [299/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:00.248 [300/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:00.248 [301/378] Linking static target drivers/librte_compress_isal.a 00:02:00.248 [302/378] Generating drivers/rte_bus_pci.sym_chk with a custom 
command (wrapped by meson to capture output) 00:02:00.248 [303/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:00.248 [304/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:00.248 [305/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:00.248 [306/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:00.248 [307/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:00.248 [308/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:00.248 [309/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:00.248 [310/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:00.248 [311/378] Linking static target drivers/librte_common_mlx5.a 00:02:00.506 [312/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:00.506 [313/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:00.764 [314/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:00.764 [315/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:01.023 [316/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:01.023 [317/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:01.023 [318/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:01.023 [319/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:01.023 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:01.281 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:01.281 [322/378] Compiling C object 
drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:01.281 [323/378] Linking static target drivers/librte_common_qat.a 00:02:01.540 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:01.540 [325/378] Linking static target lib/librte_vhost.a 00:02:01.540 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.074 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.605 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.134 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.663 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.663 [331/378] Linking target lib/librte_eal.so.24.1 00:02:11.663 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:11.921 [333/378] Linking target lib/librte_meter.so.24.1 00:02:11.921 [334/378] Linking target lib/librte_ring.so.24.1 00:02:11.921 [335/378] Linking target lib/librte_timer.so.24.1 00:02:11.921 [336/378] Linking target lib/librte_pci.so.24.1 00:02:11.921 [337/378] Linking target lib/librte_dmadev.so.24.1 00:02:11.921 [338/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:11.921 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:11.921 [340/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:11.921 [341/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:11.921 [342/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:11.921 [343/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:11.921 [344/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 
00:02:11.921 [345/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:11.921 [346/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:12.179 [347/378] Linking target lib/librte_rcu.so.24.1 00:02:12.179 [348/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:12.179 [349/378] Linking target lib/librte_mempool.so.24.1 00:02:12.179 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:12.179 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:12.179 [352/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:12.179 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:12.179 [354/378] Linking target lib/librte_mbuf.so.24.1 00:02:12.438 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:12.438 [356/378] Linking target lib/librte_compressdev.so.24.1 00:02:12.438 [357/378] Linking target lib/librte_net.so.24.1 00:02:12.438 [358/378] Linking target lib/librte_cryptodev.so.24.1 00:02:12.438 [359/378] Linking target lib/librte_reorder.so.24.1 00:02:12.696 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:12.696 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:12.696 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:12.696 [363/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:12.696 [364/378] Linking target lib/librte_security.so.24.1 00:02:12.696 [365/378] Linking target lib/librte_hash.so.24.1 00:02:12.696 [366/378] Linking target lib/librte_cmdline.so.24.1 00:02:12.696 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:12.953 [368/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:12.953 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:12.953 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:12.953 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:12.953 [372/378] Linking target lib/librte_power.so.24.1 00:02:12.953 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:13.212 [374/378] Linking target drivers/librte_common_qat.so.24.1 00:02:13.212 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:13.212 [376/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:13.212 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:13.212 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:13.212 INFO: autodetecting backend as ninja 00:02:13.212 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:14.586 CC lib/ut/ut.o 00:02:14.586 CC lib/ut_mock/mock.o 00:02:14.586 CC lib/log/log.o 00:02:14.586 CC lib/log/log_flags.o 00:02:14.586 CC lib/log/log_deprecated.o 00:02:14.844 LIB libspdk_ut.a 00:02:14.844 LIB libspdk_ut_mock.a 00:02:14.844 SO libspdk_ut.so.2.0 00:02:14.844 LIB libspdk_log.a 00:02:14.844 SO libspdk_ut_mock.so.6.0 00:02:14.844 SO libspdk_log.so.7.0 00:02:14.844 SYMLINK libspdk_ut.so 00:02:14.844 SYMLINK libspdk_ut_mock.so 00:02:14.844 SYMLINK libspdk_log.so 00:02:15.410 CC lib/dma/dma.o 00:02:15.410 CC lib/util/base64.o 00:02:15.410 CC lib/util/bit_array.o 00:02:15.410 CC lib/util/cpuset.o 00:02:15.410 CC lib/util/crc16.o 00:02:15.410 CC lib/util/crc32.o 00:02:15.410 CC lib/util/crc32c.o 00:02:15.410 CXX lib/trace_parser/trace.o 00:02:15.410 CC lib/util/crc32_ieee.o 00:02:15.410 CC lib/util/crc64.o 00:02:15.410 CC lib/util/dif.o 00:02:15.410 CC lib/util/fd.o 00:02:15.410 CC 
lib/util/file.o 00:02:15.410 CC lib/util/hexlify.o 00:02:15.410 CC lib/util/iov.o 00:02:15.410 CC lib/util/math.o 00:02:15.410 CC lib/util/pipe.o 00:02:15.410 CC lib/util/strerror_tls.o 00:02:15.410 CC lib/util/string.o 00:02:15.410 CC lib/util/uuid.o 00:02:15.410 CC lib/util/fd_group.o 00:02:15.410 CC lib/util/xor.o 00:02:15.410 CC lib/util/zipf.o 00:02:15.410 CC lib/ioat/ioat.o 00:02:15.410 CC lib/vfio_user/host/vfio_user_pci.o 00:02:15.410 CC lib/vfio_user/host/vfio_user.o 00:02:15.410 LIB libspdk_dma.a 00:02:15.668 SO libspdk_dma.so.4.0 00:02:15.668 LIB libspdk_ioat.a 00:02:15.668 SYMLINK libspdk_dma.so 00:02:15.668 SO libspdk_ioat.so.7.0 00:02:15.668 LIB libspdk_vfio_user.a 00:02:15.668 SO libspdk_vfio_user.so.5.0 00:02:15.668 SYMLINK libspdk_ioat.so 00:02:15.927 SYMLINK libspdk_vfio_user.so 00:02:15.927 LIB libspdk_util.a 00:02:15.927 SO libspdk_util.so.9.1 00:02:16.186 SYMLINK libspdk_util.so 00:02:16.186 LIB libspdk_trace_parser.a 00:02:16.186 SO libspdk_trace_parser.so.5.0 00:02:16.445 SYMLINK libspdk_trace_parser.so 00:02:16.445 CC lib/rdma_provider/common.o 00:02:16.445 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:16.445 CC lib/json/json_parse.o 00:02:16.445 CC lib/env_dpdk/env.o 00:02:16.445 CC lib/env_dpdk/memory.o 00:02:16.445 CC lib/json/json_write.o 00:02:16.445 CC lib/json/json_util.o 00:02:16.445 CC lib/env_dpdk/pci.o 00:02:16.445 CC lib/env_dpdk/init.o 00:02:16.445 CC lib/env_dpdk/threads.o 00:02:16.445 CC lib/env_dpdk/pci_virtio.o 00:02:16.445 CC lib/env_dpdk/pci_ioat.o 00:02:16.445 CC lib/env_dpdk/pci_vmd.o 00:02:16.445 CC lib/env_dpdk/pci_idxd.o 00:02:16.445 CC lib/conf/conf.o 00:02:16.445 CC lib/env_dpdk/pci_event.o 00:02:16.445 CC lib/env_dpdk/pci_dpdk.o 00:02:16.445 CC lib/env_dpdk/sigbus_handler.o 00:02:16.445 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:16.445 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:16.445 CC lib/reduce/reduce.o 00:02:16.445 CC lib/rdma_utils/rdma_utils.o 00:02:16.445 CC lib/vmd/vmd.o 00:02:16.445 CC lib/vmd/led.o 00:02:16.445 
CC lib/idxd/idxd.o 00:02:16.445 CC lib/idxd/idxd_kernel.o 00:02:16.445 CC lib/idxd/idxd_user.o 00:02:16.704 LIB libspdk_rdma_provider.a 00:02:16.704 SO libspdk_rdma_provider.so.6.0 00:02:16.704 LIB libspdk_conf.a 00:02:16.704 SO libspdk_conf.so.6.0 00:02:16.704 LIB libspdk_json.a 00:02:16.704 SYMLINK libspdk_rdma_provider.so 00:02:16.704 LIB libspdk_rdma_utils.a 00:02:16.995 SYMLINK libspdk_conf.so 00:02:16.995 SO libspdk_json.so.6.0 00:02:16.995 SO libspdk_rdma_utils.so.1.0 00:02:16.995 SYMLINK libspdk_json.so 00:02:16.995 SYMLINK libspdk_rdma_utils.so 00:02:16.995 LIB libspdk_idxd.a 00:02:17.283 SO libspdk_idxd.so.12.0 00:02:17.283 LIB libspdk_reduce.a 00:02:17.283 LIB libspdk_vmd.a 00:02:17.283 SO libspdk_reduce.so.6.0 00:02:17.283 SYMLINK libspdk_idxd.so 00:02:17.283 SO libspdk_vmd.so.6.0 00:02:17.283 CC lib/jsonrpc/jsonrpc_server.o 00:02:17.283 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:17.283 CC lib/jsonrpc/jsonrpc_client.o 00:02:17.283 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:17.283 SYMLINK libspdk_reduce.so 00:02:17.283 SYMLINK libspdk_vmd.so 00:02:17.542 LIB libspdk_jsonrpc.a 00:02:17.542 SO libspdk_jsonrpc.so.6.0 00:02:17.800 SYMLINK libspdk_jsonrpc.so 00:02:17.800 LIB libspdk_env_dpdk.a 00:02:18.059 SO libspdk_env_dpdk.so.14.1 00:02:18.059 CC lib/rpc/rpc.o 00:02:18.059 SYMLINK libspdk_env_dpdk.so 00:02:18.317 LIB libspdk_rpc.a 00:02:18.317 SO libspdk_rpc.so.6.0 00:02:18.317 SYMLINK libspdk_rpc.so 00:02:18.576 CC lib/trace/trace.o 00:02:18.576 CC lib/trace/trace_flags.o 00:02:18.576 CC lib/trace/trace_rpc.o 00:02:18.576 CC lib/notify/notify.o 00:02:18.576 CC lib/notify/notify_rpc.o 00:02:18.835 CC lib/keyring/keyring_rpc.o 00:02:18.835 CC lib/keyring/keyring.o 00:02:18.835 LIB libspdk_notify.a 00:02:18.835 LIB libspdk_trace.a 00:02:18.835 SO libspdk_notify.so.6.0 00:02:18.835 SO libspdk_trace.so.10.0 00:02:19.094 LIB libspdk_keyring.a 00:02:19.094 SYMLINK libspdk_notify.so 00:02:19.094 SO libspdk_keyring.so.1.0 00:02:19.094 SYMLINK libspdk_trace.so 
00:02:19.094 SYMLINK libspdk_keyring.so 00:02:19.353 CC lib/sock/sock.o 00:02:19.353 CC lib/sock/sock_rpc.o 00:02:19.353 CC lib/thread/thread.o 00:02:19.353 CC lib/thread/iobuf.o 00:02:19.922 LIB libspdk_sock.a 00:02:19.922 SO libspdk_sock.so.10.0 00:02:19.922 SYMLINK libspdk_sock.so 00:02:20.181 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:20.181 CC lib/nvme/nvme_ctrlr.o 00:02:20.181 CC lib/nvme/nvme_fabric.o 00:02:20.181 CC lib/nvme/nvme_ns_cmd.o 00:02:20.181 CC lib/nvme/nvme_ns.o 00:02:20.181 CC lib/nvme/nvme_pcie_common.o 00:02:20.181 CC lib/nvme/nvme_pcie.o 00:02:20.181 CC lib/nvme/nvme_qpair.o 00:02:20.181 CC lib/nvme/nvme.o 00:02:20.181 CC lib/nvme/nvme_quirks.o 00:02:20.181 CC lib/nvme/nvme_transport.o 00:02:20.181 CC lib/nvme/nvme_discovery.o 00:02:20.181 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:20.181 CC lib/nvme/nvme_tcp.o 00:02:20.181 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:20.181 CC lib/nvme/nvme_opal.o 00:02:20.181 CC lib/nvme/nvme_io_msg.o 00:02:20.181 CC lib/nvme/nvme_poll_group.o 00:02:20.181 CC lib/nvme/nvme_zns.o 00:02:20.181 CC lib/nvme/nvme_stubs.o 00:02:20.181 CC lib/nvme/nvme_auth.o 00:02:20.181 CC lib/nvme/nvme_rdma.o 00:02:20.181 CC lib/nvme/nvme_cuse.o 00:02:21.118 LIB libspdk_thread.a 00:02:21.118 SO libspdk_thread.so.10.1 00:02:21.118 SYMLINK libspdk_thread.so 00:02:21.376 CC lib/accel/accel.o 00:02:21.376 CC lib/accel/accel_rpc.o 00:02:21.376 CC lib/accel/accel_sw.o 00:02:21.376 CC lib/init/json_config.o 00:02:21.376 CC lib/init/subsystem.o 00:02:21.376 CC lib/init/subsystem_rpc.o 00:02:21.376 CC lib/init/rpc.o 00:02:21.376 CC lib/blob/zeroes.o 00:02:21.376 CC lib/blob/blobstore.o 00:02:21.376 CC lib/virtio/virtio.o 00:02:21.376 CC lib/blob/request.o 00:02:21.376 CC lib/virtio/virtio_vhost_user.o 00:02:21.376 CC lib/blob/blob_bs_dev.o 00:02:21.376 CC lib/virtio/virtio_vfio_user.o 00:02:21.376 CC lib/virtio/virtio_pci.o 00:02:21.635 LIB libspdk_init.a 00:02:21.635 SO libspdk_init.so.5.0 00:02:21.635 LIB libspdk_virtio.a 00:02:21.894 SO 
libspdk_virtio.so.7.0 00:02:21.894 SYMLINK libspdk_init.so 00:02:21.894 SYMLINK libspdk_virtio.so 00:02:22.152 CC lib/event/app.o 00:02:22.152 CC lib/event/app_rpc.o 00:02:22.152 CC lib/event/reactor.o 00:02:22.152 CC lib/event/log_rpc.o 00:02:22.152 CC lib/event/scheduler_static.o 00:02:22.409 LIB libspdk_accel.a 00:02:22.409 SO libspdk_accel.so.15.1 00:02:22.409 LIB libspdk_nvme.a 00:02:22.409 SYMLINK libspdk_accel.so 00:02:22.668 LIB libspdk_event.a 00:02:22.668 SO libspdk_nvme.so.13.1 00:02:22.668 SO libspdk_event.so.14.0 00:02:22.668 SYMLINK libspdk_event.so 00:02:22.926 CC lib/bdev/bdev.o 00:02:22.926 CC lib/bdev/bdev_rpc.o 00:02:22.926 CC lib/bdev/bdev_zone.o 00:02:22.926 CC lib/bdev/part.o 00:02:22.926 CC lib/bdev/scsi_nvme.o 00:02:22.926 SYMLINK libspdk_nvme.so 00:02:24.296 LIB libspdk_blob.a 00:02:24.296 SO libspdk_blob.so.11.0 00:02:24.552 SYMLINK libspdk_blob.so 00:02:24.809 CC lib/lvol/lvol.o 00:02:24.809 CC lib/blobfs/blobfs.o 00:02:24.809 CC lib/blobfs/tree.o 00:02:25.375 LIB libspdk_blobfs.a 00:02:25.375 LIB libspdk_bdev.a 00:02:25.375 SO libspdk_blobfs.so.10.0 00:02:25.633 SO libspdk_bdev.so.15.1 00:02:25.633 SYMLINK libspdk_blobfs.so 00:02:25.633 SYMLINK libspdk_bdev.so 00:02:25.891 LIB libspdk_lvol.a 00:02:25.891 SO libspdk_lvol.so.10.0 00:02:25.891 SYMLINK libspdk_lvol.so 00:02:25.891 CC lib/ublk/ublk.o 00:02:25.891 CC lib/ublk/ublk_rpc.o 00:02:25.891 CC lib/nvmf/ctrlr.o 00:02:25.891 CC lib/nvmf/ctrlr_discovery.o 00:02:25.891 CC lib/scsi/lun.o 00:02:25.891 CC lib/nvmf/ctrlr_bdev.o 00:02:25.891 CC lib/nvmf/subsystem.o 00:02:25.891 CC lib/scsi/dev.o 00:02:25.891 CC lib/nvmf/nvmf.o 00:02:25.891 CC lib/scsi/port.o 00:02:25.891 CC lib/nvmf/tcp.o 00:02:25.892 CC lib/nvmf/nvmf_rpc.o 00:02:25.892 CC lib/scsi/scsi.o 00:02:25.892 CC lib/nvmf/transport.o 00:02:25.892 CC lib/scsi/scsi_bdev.o 00:02:25.892 CC lib/nvmf/stubs.o 00:02:25.892 CC lib/scsi/scsi_pr.o 00:02:25.892 CC lib/nvmf/mdns_server.o 00:02:25.892 CC lib/scsi/scsi_rpc.o 00:02:25.892 CC 
lib/nvmf/rdma.o 00:02:25.892 CC lib/nvmf/auth.o 00:02:25.892 CC lib/scsi/task.o 00:02:25.892 CC lib/nbd/nbd.o 00:02:25.892 CC lib/ftl/ftl_core.o 00:02:25.892 CC lib/ftl/ftl_init.o 00:02:25.892 CC lib/nbd/nbd_rpc.o 00:02:25.892 CC lib/ftl/ftl_layout.o 00:02:25.892 CC lib/ftl/ftl_io.o 00:02:25.892 CC lib/ftl/ftl_debug.o 00:02:25.892 CC lib/ftl/ftl_sb.o 00:02:25.892 CC lib/ftl/ftl_l2p.o 00:02:25.892 CC lib/ftl/ftl_nv_cache.o 00:02:25.892 CC lib/ftl/ftl_l2p_flat.o 00:02:25.892 CC lib/ftl/ftl_band_ops.o 00:02:25.892 CC lib/ftl/ftl_band.o 00:02:25.892 CC lib/ftl/ftl_writer.o 00:02:25.892 CC lib/ftl/ftl_rq.o 00:02:25.892 CC lib/ftl/ftl_l2p_cache.o 00:02:25.892 CC lib/ftl/ftl_reloc.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt.o 00:02:25.892 CC lib/ftl/ftl_p2l.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:25.892 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:26.157 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:26.157 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:26.157 CC lib/ftl/utils/ftl_conf.o 00:02:26.157 CC lib/ftl/utils/ftl_mempool.o 00:02:26.157 CC lib/ftl/utils/ftl_md.o 00:02:26.157 CC lib/ftl/utils/ftl_bitmap.o 00:02:26.157 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:26.157 CC lib/ftl/utils/ftl_property.o 00:02:26.157 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:26.157 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:26.157 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:26.157 CC 
lib/ftl/nvc/ftl_nvc_dev.o 00:02:26.157 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:26.157 CC lib/ftl/base/ftl_base_dev.o 00:02:26.157 CC lib/ftl/base/ftl_base_bdev.o 00:02:26.157 CC lib/ftl/ftl_trace.o 00:02:26.725 LIB libspdk_nbd.a 00:02:26.725 LIB libspdk_ublk.a 00:02:26.725 LIB libspdk_scsi.a 00:02:26.725 SO libspdk_ublk.so.3.0 00:02:26.725 SO libspdk_nbd.so.7.0 00:02:26.725 SO libspdk_scsi.so.9.0 00:02:26.983 SYMLINK libspdk_ublk.so 00:02:26.983 SYMLINK libspdk_nbd.so 00:02:26.983 SYMLINK libspdk_scsi.so 00:02:27.241 CC lib/vhost/vhost_scsi.o 00:02:27.241 CC lib/vhost/vhost.o 00:02:27.241 CC lib/vhost/vhost_rpc.o 00:02:27.241 CC lib/vhost/vhost_blk.o 00:02:27.241 CC lib/vhost/rte_vhost_user.o 00:02:27.241 CC lib/iscsi/conn.o 00:02:27.241 CC lib/iscsi/init_grp.o 00:02:27.241 CC lib/iscsi/param.o 00:02:27.241 CC lib/iscsi/iscsi.o 00:02:27.241 LIB libspdk_ftl.a 00:02:27.241 CC lib/iscsi/md5.o 00:02:27.241 CC lib/iscsi/portal_grp.o 00:02:27.241 CC lib/iscsi/tgt_node.o 00:02:27.241 CC lib/iscsi/iscsi_subsystem.o 00:02:27.241 CC lib/iscsi/iscsi_rpc.o 00:02:27.241 CC lib/iscsi/task.o 00:02:27.499 SO libspdk_ftl.so.9.0 00:02:27.756 SYMLINK libspdk_ftl.so 00:02:28.321 LIB libspdk_nvmf.a 00:02:28.321 SO libspdk_nvmf.so.18.1 00:02:28.580 LIB libspdk_iscsi.a 00:02:28.580 SYMLINK libspdk_nvmf.so 00:02:28.838 SO libspdk_iscsi.so.8.0 00:02:28.838 SYMLINK libspdk_iscsi.so 00:02:29.405 LIB libspdk_vhost.a 00:02:29.405 SO libspdk_vhost.so.8.0 00:02:29.405 SYMLINK libspdk_vhost.so 00:02:29.974 CC module/env_dpdk/env_dpdk_rpc.o 00:02:30.233 LIB libspdk_env_dpdk_rpc.a 00:02:30.233 CC module/blob/bdev/blob_bdev.o 00:02:30.233 CC module/scheduler/gscheduler/gscheduler.o 00:02:30.233 CC module/accel/error/accel_error.o 00:02:30.233 CC module/accel/error/accel_error_rpc.o 00:02:30.233 CC module/accel/iaa/accel_iaa.o 00:02:30.233 CC module/accel/iaa/accel_iaa_rpc.o 00:02:30.233 SO libspdk_env_dpdk_rpc.so.6.0 00:02:30.233 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:30.233 
CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:30.233 CC module/keyring/linux/keyring.o 00:02:30.233 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:30.233 CC module/keyring/linux/keyring_rpc.o 00:02:30.233 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:30.233 CC module/accel/ioat/accel_ioat.o 00:02:30.233 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:30.233 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:30.233 CC module/accel/ioat/accel_ioat_rpc.o 00:02:30.233 CC module/keyring/file/keyring.o 00:02:30.233 CC module/keyring/file/keyring_rpc.o 00:02:30.233 CC module/accel/dsa/accel_dsa.o 00:02:30.233 CC module/accel/dsa/accel_dsa_rpc.o 00:02:30.233 CC module/sock/posix/posix.o 00:02:30.233 SYMLINK libspdk_env_dpdk_rpc.so 00:02:30.492 LIB libspdk_scheduler_gscheduler.a 00:02:30.492 LIB libspdk_accel_error.a 00:02:30.492 SO libspdk_scheduler_gscheduler.so.4.0 00:02:30.492 LIB libspdk_keyring_file.a 00:02:30.492 LIB libspdk_keyring_linux.a 00:02:30.492 SO libspdk_accel_error.so.2.0 00:02:30.492 LIB libspdk_scheduler_dpdk_governor.a 00:02:30.492 SO libspdk_keyring_file.so.1.0 00:02:30.492 LIB libspdk_accel_iaa.a 00:02:30.492 SYMLINK libspdk_scheduler_gscheduler.so 00:02:30.492 LIB libspdk_scheduler_dynamic.a 00:02:30.492 SO libspdk_keyring_linux.so.1.0 00:02:30.492 LIB libspdk_accel_ioat.a 00:02:30.492 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:30.492 SYMLINK libspdk_accel_error.so 00:02:30.492 SO libspdk_accel_iaa.so.3.0 00:02:30.492 LIB libspdk_blob_bdev.a 00:02:30.492 SO libspdk_scheduler_dynamic.so.4.0 00:02:30.492 SO libspdk_accel_ioat.so.6.0 00:02:30.492 LIB libspdk_accel_dsa.a 00:02:30.492 SYMLINK libspdk_keyring_file.so 00:02:30.492 SO libspdk_blob_bdev.so.11.0 00:02:30.492 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:30.492 SYMLINK libspdk_keyring_linux.so 00:02:30.492 SO libspdk_accel_dsa.so.5.0 00:02:30.492 SYMLINK libspdk_accel_iaa.so 00:02:30.492 SYMLINK libspdk_scheduler_dynamic.so 
00:02:30.492 SYMLINK libspdk_accel_ioat.so 00:02:30.492 SYMLINK libspdk_blob_bdev.so 00:02:30.751 SYMLINK libspdk_accel_dsa.so 00:02:31.010 LIB libspdk_sock_posix.a 00:02:31.010 SO libspdk_sock_posix.so.6.0 00:02:31.010 CC module/bdev/split/vbdev_split.o 00:02:31.010 CC module/bdev/split/vbdev_split_rpc.o 00:02:31.010 CC module/bdev/iscsi/bdev_iscsi.o 00:02:31.010 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:31.010 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:31.010 CC module/bdev/malloc/bdev_malloc.o 00:02:31.010 CC module/bdev/ftl/bdev_ftl.o 00:02:31.010 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:31.010 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:31.010 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:31.010 CC module/bdev/compress/vbdev_compress.o 00:02:31.010 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:31.010 CC module/bdev/nvme/bdev_nvme.o 00:02:31.010 CC module/bdev/lvol/vbdev_lvol.o 00:02:31.010 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:31.010 CC module/bdev/nvme/nvme_rpc.o 00:02:31.010 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:31.010 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:31.010 CC module/bdev/nvme/vbdev_opal.o 00:02:31.010 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:31.010 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:31.010 CC module/bdev/nvme/bdev_mdns_client.o 00:02:31.010 CC module/bdev/passthru/vbdev_passthru.o 00:02:31.010 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:31.010 CC module/bdev/delay/vbdev_delay.o 00:02:31.010 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:31.010 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:31.010 CC module/bdev/aio/bdev_aio_rpc.o 00:02:31.010 CC module/bdev/aio/bdev_aio.o 00:02:31.010 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:31.010 CC module/bdev/raid/bdev_raid.o 00:02:31.010 CC module/bdev/raid/bdev_raid_sb.o 00:02:31.010 CC module/bdev/gpt/gpt.o 00:02:31.010 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:31.010 CC module/blobfs/bdev/blobfs_bdev.o 00:02:31.010 CC 
module/bdev/gpt/vbdev_gpt.o 00:02:31.010 CC module/bdev/raid/raid0.o 00:02:31.010 CC module/bdev/raid/raid1.o 00:02:31.010 CC module/bdev/raid/concat.o 00:02:31.010 CC module/bdev/raid/bdev_raid_rpc.o 00:02:31.010 CC module/bdev/error/vbdev_error.o 00:02:31.010 CC module/bdev/error/vbdev_error_rpc.o 00:02:31.010 CC module/bdev/crypto/vbdev_crypto.o 00:02:31.010 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:31.010 CC module/bdev/null/bdev_null.o 00:02:31.010 CC module/bdev/null/bdev_null_rpc.o 00:02:31.269 SYMLINK libspdk_sock_posix.so 00:02:31.269 LIB libspdk_accel_dpdk_cryptodev.a 00:02:31.269 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:31.527 LIB libspdk_bdev_passthru.a 00:02:31.527 LIB libspdk_blobfs_bdev.a 00:02:31.527 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:31.527 SO libspdk_bdev_passthru.so.6.0 00:02:31.527 SO libspdk_blobfs_bdev.so.6.0 00:02:31.527 LIB libspdk_accel_dpdk_compressdev.a 00:02:31.527 LIB libspdk_bdev_ftl.a 00:02:31.527 LIB libspdk_bdev_split.a 00:02:31.527 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:31.527 SO libspdk_bdev_ftl.so.6.0 00:02:31.527 LIB libspdk_bdev_iscsi.a 00:02:31.527 SO libspdk_bdev_split.so.6.0 00:02:31.527 LIB libspdk_bdev_zone_block.a 00:02:31.527 SYMLINK libspdk_bdev_passthru.so 00:02:31.527 LIB libspdk_bdev_error.a 00:02:31.527 SYMLINK libspdk_blobfs_bdev.so 00:02:31.527 SO libspdk_bdev_iscsi.so.6.0 00:02:31.527 LIB libspdk_bdev_compress.a 00:02:31.527 SO libspdk_bdev_zone_block.so.6.0 00:02:31.528 LIB libspdk_bdev_null.a 00:02:31.528 SO libspdk_bdev_error.so.6.0 00:02:31.528 LIB libspdk_bdev_gpt.a 00:02:31.528 SYMLINK libspdk_bdev_split.so 00:02:31.528 SYMLINK libspdk_bdev_ftl.so 00:02:31.528 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:31.528 SO libspdk_bdev_compress.so.6.0 00:02:31.528 SO libspdk_bdev_null.so.6.0 00:02:31.528 SYMLINK libspdk_bdev_zone_block.so 00:02:31.528 SO libspdk_bdev_gpt.so.6.0 00:02:31.528 SYMLINK libspdk_bdev_iscsi.so 00:02:31.528 SYMLINK libspdk_bdev_error.so 00:02:31.528 LIB 
libspdk_bdev_crypto.a 00:02:31.528 LIB libspdk_bdev_malloc.a 00:02:31.528 LIB libspdk_bdev_aio.a 00:02:31.528 SYMLINK libspdk_bdev_compress.so 00:02:31.787 SYMLINK libspdk_bdev_null.so 00:02:31.787 SYMLINK libspdk_bdev_gpt.so 00:02:31.787 SO libspdk_bdev_crypto.so.6.0 00:02:31.787 SO libspdk_bdev_malloc.so.6.0 00:02:31.787 SO libspdk_bdev_aio.so.6.0 00:02:31.787 LIB libspdk_bdev_virtio.a 00:02:31.787 SO libspdk_bdev_virtio.so.6.0 00:02:31.787 SYMLINK libspdk_bdev_crypto.so 00:02:31.787 SYMLINK libspdk_bdev_malloc.so 00:02:31.787 LIB libspdk_bdev_lvol.a 00:02:31.787 SYMLINK libspdk_bdev_aio.so 00:02:31.787 SO libspdk_bdev_lvol.so.6.0 00:02:31.787 SYMLINK libspdk_bdev_virtio.so 00:02:31.787 SYMLINK libspdk_bdev_lvol.so 00:02:32.093 LIB libspdk_bdev_delay.a 00:02:32.093 SO libspdk_bdev_delay.so.6.0 00:02:32.093 SYMLINK libspdk_bdev_delay.so 00:02:32.093 LIB libspdk_bdev_raid.a 00:02:32.093 SO libspdk_bdev_raid.so.6.0 00:02:32.352 SYMLINK libspdk_bdev_raid.so 00:02:33.728 LIB libspdk_bdev_nvme.a 00:02:33.728 SO libspdk_bdev_nvme.so.7.0 00:02:33.728 SYMLINK libspdk_bdev_nvme.so 00:02:34.663 CC module/event/subsystems/sock/sock.o 00:02:34.663 CC module/event/subsystems/scheduler/scheduler.o 00:02:34.663 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:34.663 CC module/event/subsystems/keyring/keyring.o 00:02:34.663 CC module/event/subsystems/vmd/vmd.o 00:02:34.663 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:34.663 CC module/event/subsystems/iobuf/iobuf.o 00:02:34.663 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:34.663 LIB libspdk_event_keyring.a 00:02:34.663 LIB libspdk_event_vhost_blk.a 00:02:34.663 LIB libspdk_event_sock.a 00:02:34.663 LIB libspdk_event_scheduler.a 00:02:34.663 LIB libspdk_event_vmd.a 00:02:34.663 SO libspdk_event_keyring.so.1.0 00:02:34.663 SO libspdk_event_vhost_blk.so.3.0 00:02:34.663 LIB libspdk_event_iobuf.a 00:02:34.663 SO libspdk_event_sock.so.5.0 00:02:34.663 SO libspdk_event_scheduler.so.4.0 00:02:34.663 SO 
libspdk_event_vmd.so.6.0 00:02:34.663 SO libspdk_event_iobuf.so.3.0 00:02:34.663 SYMLINK libspdk_event_keyring.so 00:02:34.663 SYMLINK libspdk_event_vhost_blk.so 00:02:34.663 SYMLINK libspdk_event_sock.so 00:02:34.663 SYMLINK libspdk_event_scheduler.so 00:02:34.663 SYMLINK libspdk_event_vmd.so 00:02:34.922 SYMLINK libspdk_event_iobuf.so 00:02:35.180 CC module/event/subsystems/accel/accel.o 00:02:35.439 LIB libspdk_event_accel.a 00:02:35.439 SO libspdk_event_accel.so.6.0 00:02:35.439 SYMLINK libspdk_event_accel.so 00:02:35.698 CC module/event/subsystems/bdev/bdev.o 00:02:35.957 LIB libspdk_event_bdev.a 00:02:35.957 SO libspdk_event_bdev.so.6.0 00:02:35.957 SYMLINK libspdk_event_bdev.so 00:02:36.523 CC module/event/subsystems/scsi/scsi.o 00:02:36.523 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:36.523 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:36.523 CC module/event/subsystems/ublk/ublk.o 00:02:36.523 CC module/event/subsystems/nbd/nbd.o 00:02:36.523 LIB libspdk_event_nbd.a 00:02:36.524 LIB libspdk_event_ublk.a 00:02:36.524 LIB libspdk_event_scsi.a 00:02:36.524 SO libspdk_event_nbd.so.6.0 00:02:36.524 LIB libspdk_event_nvmf.a 00:02:36.524 SO libspdk_event_ublk.so.3.0 00:02:36.524 SO libspdk_event_scsi.so.6.0 00:02:36.524 SO libspdk_event_nvmf.so.6.0 00:02:36.782 SYMLINK libspdk_event_nbd.so 00:02:36.782 SYMLINK libspdk_event_ublk.so 00:02:36.782 SYMLINK libspdk_event_scsi.so 00:02:36.782 SYMLINK libspdk_event_nvmf.so 00:02:37.041 CC module/event/subsystems/iscsi/iscsi.o 00:02:37.041 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:37.300 LIB libspdk_event_vhost_scsi.a 00:02:37.300 SO libspdk_event_vhost_scsi.so.3.0 00:02:37.300 LIB libspdk_event_iscsi.a 00:02:37.300 SO libspdk_event_iscsi.so.6.0 00:02:37.300 SYMLINK libspdk_event_vhost_scsi.so 00:02:37.300 SYMLINK libspdk_event_iscsi.so 00:02:37.559 SO libspdk.so.6.0 00:02:37.559 SYMLINK libspdk.so 00:02:38.138 CC test/rpc_client/rpc_client_test.o 00:02:38.138 TEST_HEADER include/spdk/accel.h 
00:02:38.138 TEST_HEADER include/spdk/assert.h 00:02:38.138 TEST_HEADER include/spdk/accel_module.h 00:02:38.138 TEST_HEADER include/spdk/base64.h 00:02:38.138 TEST_HEADER include/spdk/barrier.h 00:02:38.138 TEST_HEADER include/spdk/bdev.h 00:02:38.138 TEST_HEADER include/spdk/bdev_module.h 00:02:38.138 CXX app/trace/trace.o 00:02:38.139 TEST_HEADER include/spdk/bit_pool.h 00:02:38.139 TEST_HEADER include/spdk/bdev_zone.h 00:02:38.139 TEST_HEADER include/spdk/bit_array.h 00:02:38.139 TEST_HEADER include/spdk/blob_bdev.h 00:02:38.139 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:38.139 TEST_HEADER include/spdk/blob.h 00:02:38.139 TEST_HEADER include/spdk/blobfs.h 00:02:38.139 TEST_HEADER include/spdk/config.h 00:02:38.139 TEST_HEADER include/spdk/conf.h 00:02:38.139 TEST_HEADER include/spdk/crc16.h 00:02:38.139 TEST_HEADER include/spdk/crc32.h 00:02:38.139 TEST_HEADER include/spdk/cpuset.h 00:02:38.139 CC app/trace_record/trace_record.o 00:02:38.139 TEST_HEADER include/spdk/crc64.h 00:02:38.139 TEST_HEADER include/spdk/dif.h 00:02:38.139 TEST_HEADER include/spdk/dma.h 00:02:38.139 CC app/spdk_nvme_identify/identify.o 00:02:38.139 TEST_HEADER include/spdk/env_dpdk.h 00:02:38.139 TEST_HEADER include/spdk/endian.h 00:02:38.139 TEST_HEADER include/spdk/env.h 00:02:38.139 TEST_HEADER include/spdk/event.h 00:02:38.139 TEST_HEADER include/spdk/fd.h 00:02:38.139 TEST_HEADER include/spdk/file.h 00:02:38.139 TEST_HEADER include/spdk/fd_group.h 00:02:38.139 TEST_HEADER include/spdk/ftl.h 00:02:38.139 CC app/spdk_nvme_discover/discovery_aer.o 00:02:38.139 TEST_HEADER include/spdk/gpt_spec.h 00:02:38.139 TEST_HEADER include/spdk/hexlify.h 00:02:38.139 TEST_HEADER include/spdk/histogram_data.h 00:02:38.139 TEST_HEADER include/spdk/idxd.h 00:02:38.139 TEST_HEADER include/spdk/idxd_spec.h 00:02:38.139 CC app/spdk_nvme_perf/perf.o 00:02:38.139 TEST_HEADER include/spdk/init.h 00:02:38.139 TEST_HEADER include/spdk/ioat.h 00:02:38.139 TEST_HEADER include/spdk/ioat_spec.h 00:02:38.139 
TEST_HEADER include/spdk/json.h 00:02:38.139 TEST_HEADER include/spdk/iscsi_spec.h 00:02:38.139 CC app/spdk_top/spdk_top.o 00:02:38.139 TEST_HEADER include/spdk/jsonrpc.h 00:02:38.139 TEST_HEADER include/spdk/keyring.h 00:02:38.139 TEST_HEADER include/spdk/likely.h 00:02:38.139 TEST_HEADER include/spdk/keyring_module.h 00:02:38.139 TEST_HEADER include/spdk/log.h 00:02:38.139 TEST_HEADER include/spdk/memory.h 00:02:38.139 TEST_HEADER include/spdk/mmio.h 00:02:38.139 TEST_HEADER include/spdk/lvol.h 00:02:38.139 TEST_HEADER include/spdk/nbd.h 00:02:38.139 CC app/spdk_lspci/spdk_lspci.o 00:02:38.139 TEST_HEADER include/spdk/nvme.h 00:02:38.139 TEST_HEADER include/spdk/notify.h 00:02:38.139 TEST_HEADER include/spdk/nvme_intel.h 00:02:38.139 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:38.139 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:38.139 TEST_HEADER include/spdk/nvme_spec.h 00:02:38.139 TEST_HEADER include/spdk/nvme_zns.h 00:02:38.139 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:38.139 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:38.139 TEST_HEADER include/spdk/nvmf.h 00:02:38.139 TEST_HEADER include/spdk/nvmf_spec.h 00:02:38.139 TEST_HEADER include/spdk/nvmf_transport.h 00:02:38.139 TEST_HEADER include/spdk/opal.h 00:02:38.139 TEST_HEADER include/spdk/opal_spec.h 00:02:38.139 TEST_HEADER include/spdk/pci_ids.h 00:02:38.139 TEST_HEADER include/spdk/pipe.h 00:02:38.139 TEST_HEADER include/spdk/queue.h 00:02:38.139 TEST_HEADER include/spdk/reduce.h 00:02:38.139 TEST_HEADER include/spdk/rpc.h 00:02:38.139 TEST_HEADER include/spdk/scheduler.h 00:02:38.139 TEST_HEADER include/spdk/scsi_spec.h 00:02:38.139 TEST_HEADER include/spdk/scsi.h 00:02:38.139 TEST_HEADER include/spdk/sock.h 00:02:38.139 TEST_HEADER include/spdk/stdinc.h 00:02:38.139 TEST_HEADER include/spdk/string.h 00:02:38.139 TEST_HEADER include/spdk/thread.h 00:02:38.139 TEST_HEADER include/spdk/trace.h 00:02:38.139 TEST_HEADER include/spdk/trace_parser.h 00:02:38.139 TEST_HEADER include/spdk/tree.h 
00:02:38.139 TEST_HEADER include/spdk/ublk.h 00:02:38.139 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:38.139 TEST_HEADER include/spdk/util.h 00:02:38.139 TEST_HEADER include/spdk/uuid.h 00:02:38.139 TEST_HEADER include/spdk/version.h 00:02:38.139 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:38.139 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:38.139 TEST_HEADER include/spdk/vhost.h 00:02:38.139 TEST_HEADER include/spdk/vmd.h 00:02:38.139 CC app/spdk_dd/spdk_dd.o 00:02:38.139 TEST_HEADER include/spdk/xor.h 00:02:38.139 CC app/nvmf_tgt/nvmf_main.o 00:02:38.139 TEST_HEADER include/spdk/zipf.h 00:02:38.139 CXX test/cpp_headers/accel.o 00:02:38.139 CXX test/cpp_headers/accel_module.o 00:02:38.139 CXX test/cpp_headers/assert.o 00:02:38.139 CXX test/cpp_headers/barrier.o 00:02:38.139 CXX test/cpp_headers/base64.o 00:02:38.139 CXX test/cpp_headers/bdev.o 00:02:38.139 CXX test/cpp_headers/bdev_module.o 00:02:38.139 CXX test/cpp_headers/bdev_zone.o 00:02:38.139 CXX test/cpp_headers/bit_array.o 00:02:38.139 CXX test/cpp_headers/bit_pool.o 00:02:38.139 CXX test/cpp_headers/blob_bdev.o 00:02:38.139 CXX test/cpp_headers/blobfs_bdev.o 00:02:38.139 CXX test/cpp_headers/blobfs.o 00:02:38.139 CXX test/cpp_headers/conf.o 00:02:38.139 CXX test/cpp_headers/config.o 00:02:38.139 CXX test/cpp_headers/cpuset.o 00:02:38.139 CXX test/cpp_headers/crc16.o 00:02:38.139 CXX test/cpp_headers/blob.o 00:02:38.139 CXX test/cpp_headers/crc32.o 00:02:38.139 CXX test/cpp_headers/crc64.o 00:02:38.139 CXX test/cpp_headers/dif.o 00:02:38.139 CXX test/cpp_headers/dma.o 00:02:38.139 CXX test/cpp_headers/endian.o 00:02:38.139 CXX test/cpp_headers/env_dpdk.o 00:02:38.139 CXX test/cpp_headers/event.o 00:02:38.139 CXX test/cpp_headers/env.o 00:02:38.139 CXX test/cpp_headers/fd_group.o 00:02:38.139 CXX test/cpp_headers/file.o 00:02:38.139 CXX test/cpp_headers/fd.o 00:02:38.139 CXX test/cpp_headers/ftl.o 00:02:38.139 CXX test/cpp_headers/gpt_spec.o 00:02:38.139 CXX test/cpp_headers/hexlify.o 00:02:38.139 
CXX test/cpp_headers/idxd.o 00:02:38.139 CXX test/cpp_headers/idxd_spec.o 00:02:38.139 CXX test/cpp_headers/init.o 00:02:38.139 CXX test/cpp_headers/histogram_data.o 00:02:38.139 CXX test/cpp_headers/ioat.o 00:02:38.139 CXX test/cpp_headers/iscsi_spec.o 00:02:38.139 CXX test/cpp_headers/ioat_spec.o 00:02:38.139 CXX test/cpp_headers/jsonrpc.o 00:02:38.139 CXX test/cpp_headers/json.o 00:02:38.139 CXX test/cpp_headers/keyring.o 00:02:38.139 CC app/iscsi_tgt/iscsi_tgt.o 00:02:38.139 CC app/spdk_tgt/spdk_tgt.o 00:02:38.139 CXX test/cpp_headers/keyring_module.o 00:02:38.139 CC examples/util/zipf/zipf.o 00:02:38.139 CC test/app/histogram_perf/histogram_perf.o 00:02:38.139 CC test/env/vtophys/vtophys.o 00:02:38.139 CC test/env/pci/pci_ut.o 00:02:38.139 CC examples/ioat/verify/verify.o 00:02:38.139 CC test/app/jsoncat/jsoncat.o 00:02:38.139 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:38.139 CC app/fio/nvme/fio_plugin.o 00:02:38.139 CC test/app/stub/stub.o 00:02:38.139 CC examples/ioat/perf/perf.o 00:02:38.139 CC test/env/memory/memory_ut.o 00:02:38.139 CC test/thread/poller_perf/poller_perf.o 00:02:38.399 CC test/dma/test_dma/test_dma.o 00:02:38.399 CC test/app/bdev_svc/bdev_svc.o 00:02:38.399 CC app/fio/bdev/fio_plugin.o 00:02:38.399 CC test/env/mem_callbacks/mem_callbacks.o 00:02:38.399 LINK spdk_lspci 00:02:38.399 LINK nvmf_tgt 00:02:38.399 LINK rpc_client_test 00:02:38.663 LINK spdk_nvme_discover 00:02:38.663 LINK jsoncat 00:02:38.663 LINK spdk_trace_record 00:02:38.663 CXX test/cpp_headers/likely.o 00:02:38.663 CXX test/cpp_headers/log.o 00:02:38.663 CXX test/cpp_headers/lvol.o 00:02:38.663 LINK interrupt_tgt 00:02:38.663 CXX test/cpp_headers/memory.o 00:02:38.663 LINK vtophys 00:02:38.663 CXX test/cpp_headers/mmio.o 00:02:38.663 LINK histogram_perf 00:02:38.663 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:38.663 CXX test/cpp_headers/nbd.o 00:02:38.663 LINK zipf 00:02:38.663 CXX test/cpp_headers/notify.o 00:02:38.663 CXX test/cpp_headers/nvme.o 
00:02:38.663 CXX test/cpp_headers/nvme_intel.o 00:02:38.663 CXX test/cpp_headers/nvme_ocssd.o 00:02:38.663 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:38.663 CXX test/cpp_headers/nvme_spec.o 00:02:38.663 CXX test/cpp_headers/nvme_zns.o 00:02:38.663 CXX test/cpp_headers/nvmf_cmd.o 00:02:38.663 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:38.663 LINK env_dpdk_post_init 00:02:38.663 CXX test/cpp_headers/nvmf.o 00:02:38.663 CXX test/cpp_headers/nvmf_spec.o 00:02:38.663 LINK poller_perf 00:02:38.663 LINK stub 00:02:38.663 CXX test/cpp_headers/nvmf_transport.o 00:02:38.663 LINK verify 00:02:38.663 CXX test/cpp_headers/opal.o 00:02:38.663 CXX test/cpp_headers/opal_spec.o 00:02:38.663 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:38.663 CXX test/cpp_headers/pci_ids.o 00:02:38.663 LINK bdev_svc 00:02:38.663 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:38.663 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:38.663 CXX test/cpp_headers/pipe.o 00:02:38.663 CXX test/cpp_headers/queue.o 00:02:38.663 CXX test/cpp_headers/reduce.o 00:02:38.663 CXX test/cpp_headers/rpc.o 00:02:38.663 CXX test/cpp_headers/scheduler.o 00:02:38.663 CXX test/cpp_headers/scsi.o 00:02:38.663 CXX test/cpp_headers/scsi_spec.o 00:02:38.663 CXX test/cpp_headers/sock.o 00:02:38.663 CXX test/cpp_headers/stdinc.o 00:02:38.663 CXX test/cpp_headers/string.o 00:02:38.663 CXX test/cpp_headers/thread.o 00:02:38.663 LINK iscsi_tgt 00:02:38.929 CXX test/cpp_headers/trace.o 00:02:38.929 CXX test/cpp_headers/trace_parser.o 00:02:38.929 CXX test/cpp_headers/tree.o 00:02:38.929 CXX test/cpp_headers/ublk.o 00:02:38.929 CXX test/cpp_headers/uuid.o 00:02:38.929 CXX test/cpp_headers/version.o 00:02:38.929 CXX test/cpp_headers/util.o 00:02:38.929 CXX test/cpp_headers/vfio_user_pci.o 00:02:38.929 LINK spdk_dd 00:02:38.929 CXX test/cpp_headers/vfio_user_spec.o 00:02:38.929 LINK ioat_perf 00:02:38.929 CXX test/cpp_headers/vhost.o 00:02:38.929 CXX test/cpp_headers/vmd.o 00:02:38.929 CXX test/cpp_headers/xor.o 00:02:38.929 CXX 
test/cpp_headers/zipf.o 00:02:39.187 LINK pci_ut 00:02:39.187 LINK spdk_trace 00:02:39.187 CC examples/idxd/perf/perf.o 00:02:39.187 CC examples/vmd/lsvmd/lsvmd.o 00:02:39.187 LINK spdk_tgt 00:02:39.187 CC examples/vmd/led/led.o 00:02:39.187 LINK spdk_nvme 00:02:39.187 CC examples/sock/hello_world/hello_sock.o 00:02:39.187 CC examples/thread/thread/thread_ex.o 00:02:39.187 LINK nvme_fuzz 00:02:39.446 LINK test_dma 00:02:39.446 CC test/event/event_perf/event_perf.o 00:02:39.446 CC test/event/reactor/reactor.o 00:02:39.446 CC test/event/reactor_perf/reactor_perf.o 00:02:39.446 CC test/event/app_repeat/app_repeat.o 00:02:39.446 LINK spdk_nvme_perf 00:02:39.446 LINK mem_callbacks 00:02:39.446 CC test/event/scheduler/scheduler.o 00:02:39.446 LINK spdk_bdev 00:02:39.446 LINK vhost_fuzz 00:02:39.446 LINK lsvmd 00:02:39.446 LINK led 00:02:39.446 LINK spdk_nvme_identify 00:02:39.446 LINK reactor 00:02:39.446 LINK spdk_top 00:02:39.446 CC app/vhost/vhost.o 00:02:39.446 LINK event_perf 00:02:39.446 LINK reactor_perf 00:02:39.446 LINK hello_sock 00:02:39.446 LINK idxd_perf 00:02:39.446 LINK app_repeat 00:02:39.705 LINK thread 00:02:39.705 LINK scheduler 00:02:39.705 LINK vhost 00:02:39.963 LINK memory_ut 00:02:39.963 CC test/nvme/aer/aer.o 00:02:39.963 CC test/nvme/e2edp/nvme_dp.o 00:02:39.963 CC test/nvme/cuse/cuse.o 00:02:39.963 CC test/nvme/reset/reset.o 00:02:39.963 CC test/nvme/startup/startup.o 00:02:39.963 CC test/nvme/overhead/overhead.o 00:02:39.963 CC test/nvme/simple_copy/simple_copy.o 00:02:39.963 CC test/nvme/sgl/sgl.o 00:02:39.963 CC test/nvme/compliance/nvme_compliance.o 00:02:39.963 CC test/nvme/boot_partition/boot_partition.o 00:02:39.963 CC test/nvme/err_injection/err_injection.o 00:02:39.963 CC test/nvme/connect_stress/connect_stress.o 00:02:39.963 CC test/nvme/fused_ordering/fused_ordering.o 00:02:39.963 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:39.963 CC test/nvme/fdp/fdp.o 00:02:39.963 CC test/nvme/reserve/reserve.o 00:02:39.963 CC 
test/accel/dif/dif.o 00:02:39.963 CC test/blobfs/mkfs/mkfs.o 00:02:39.963 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:39.963 CC examples/nvme/reconnect/reconnect.o 00:02:39.963 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:39.963 CC examples/nvme/hotplug/hotplug.o 00:02:39.963 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:39.963 CC examples/nvme/arbitration/arbitration.o 00:02:39.963 CC examples/nvme/hello_world/hello_world.o 00:02:39.963 CC examples/nvme/abort/abort.o 00:02:39.963 CC test/lvol/esnap/esnap.o 00:02:40.222 LINK connect_stress 00:02:40.222 CC examples/accel/perf/accel_perf.o 00:02:40.222 CC examples/blob/hello_world/hello_blob.o 00:02:40.222 LINK boot_partition 00:02:40.222 LINK startup 00:02:40.222 CC examples/blob/cli/blobcli.o 00:02:40.222 LINK fused_ordering 00:02:40.222 LINK doorbell_aers 00:02:40.222 LINK err_injection 00:02:40.222 LINK nvme_dp 00:02:40.222 LINK reserve 00:02:40.222 LINK mkfs 00:02:40.222 LINK reset 00:02:40.222 LINK sgl 00:02:40.222 LINK fdp 00:02:40.222 LINK pmr_persistence 00:02:40.222 LINK cmb_copy 00:02:40.222 LINK hello_world 00:02:40.222 LINK nvme_compliance 00:02:40.222 LINK overhead 00:02:40.222 LINK simple_copy 00:02:40.222 LINK hotplug 00:02:40.222 LINK aer 00:02:40.482 LINK reconnect 00:02:40.482 LINK hello_blob 00:02:40.482 LINK dif 00:02:40.482 LINK abort 00:02:40.482 LINK nvme_manage 00:02:40.741 LINK accel_perf 00:02:40.741 LINK blobcli 00:02:40.741 LINK iscsi_fuzz 00:02:40.741 LINK arbitration 00:02:41.000 CC test/bdev/bdevio/bdevio.o 00:02:41.259 CC examples/bdev/hello_world/hello_bdev.o 00:02:41.259 LINK cuse 00:02:41.259 CC examples/bdev/bdevperf/bdevperf.o 00:02:41.518 LINK bdevio 00:02:41.518 LINK hello_bdev 00:02:42.087 LINK bdevperf 00:02:42.655 CC examples/nvmf/nvmf/nvmf.o 00:02:43.223 LINK nvmf 00:02:45.126 LINK esnap 00:02:45.385 00:02:45.385 real 1m31.561s 00:02:45.385 user 17m23.838s 00:02:45.385 sys 4m13.480s 00:02:45.385 13:21:24 make -- common/autotest_common.sh@1124 -- $ 
xtrace_disable 00:02:45.385 13:21:24 make -- common/autotest_common.sh@10 -- $ set +x 00:02:45.385 ************************************ 00:02:45.385 END TEST make 00:02:45.385 ************************************ 00:02:45.644 13:21:24 -- common/autotest_common.sh@1142 -- $ return 0 00:02:45.644 13:21:24 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:45.644 13:21:24 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:45.644 13:21:24 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:45.644 13:21:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:45.644 13:21:24 -- pm/common@44 -- $ pid=1910851 00:02:45.644 13:21:24 -- pm/common@50 -- $ kill -TERM 1910851 00:02:45.644 13:21:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:45.644 13:21:24 -- pm/common@44 -- $ pid=1910853 00:02:45.644 13:21:24 -- pm/common@50 -- $ kill -TERM 1910853 00:02:45.644 13:21:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:45.644 13:21:24 -- pm/common@44 -- $ pid=1910855 00:02:45.644 13:21:24 -- pm/common@50 -- $ kill -TERM 1910855 00:02:45.644 13:21:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:45.644 13:21:24 -- pm/common@44 -- $ pid=1910874 00:02:45.644 13:21:24 -- pm/common@50 -- $ sudo -E kill -TERM 1910874 00:02:45.644 13:21:24 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 
00:02:45.644 13:21:24 -- nvmf/common.sh@7 -- # uname -s 00:02:45.644 13:21:24 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:45.644 13:21:24 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:45.644 13:21:24 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:45.644 13:21:24 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:45.644 13:21:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:45.644 13:21:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:45.644 13:21:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:45.644 13:21:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:45.644 13:21:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:45.644 13:21:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:45.644 13:21:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:02:45.644 13:21:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:02:45.644 13:21:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:45.644 13:21:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:45.644 13:21:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:45.644 13:21:24 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:45.644 13:21:24 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:45.644 13:21:24 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:45.644 13:21:24 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:45.644 13:21:24 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:45.644 13:21:24 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.644 13:21:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.644 13:21:24 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.644 13:21:24 -- paths/export.sh@5 -- # export PATH 00:02:45.644 13:21:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:45.644 13:21:24 -- nvmf/common.sh@47 -- # : 0 00:02:45.644 13:21:24 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:45.644 13:21:24 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:45.644 13:21:24 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:45.644 13:21:24 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:45.644 13:21:24 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:45.644 13:21:24 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:45.644 13:21:24 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:45.644 13:21:24 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:45.644 13:21:24 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:45.644 13:21:24 -- spdk/autotest.sh@32 -- # 
uname -s 00:02:45.644 13:21:24 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:45.644 13:21:24 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:45.644 13:21:24 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:45.644 13:21:24 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:45.644 13:21:24 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:45.644 13:21:24 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:45.644 13:21:24 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:45.644 13:21:24 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:45.644 13:21:24 -- spdk/autotest.sh@48 -- # udevadm_pid=1977831 00:02:45.644 13:21:24 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:45.644 13:21:24 -- pm/common@17 -- # local monitor 00:02:45.644 13:21:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:45.644 13:21:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:45.644 13:21:24 -- pm/common@25 -- # sleep 1 00:02:45.644 13:21:24 -- pm/common@21 -- # date +%s 00:02:45.644 13:21:24 -- pm/common@21 -- # date +%s 00:02:45.644 13:21:24 -- pm/common@21 -- # date +%s 00:02:45.644 13:21:24 -- pm/common@21 -- # date +%s 00:02:45.644 13:21:24 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042484 00:02:45.644 13:21:24 -- pm/common@21 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042484 00:02:45.644 13:21:24 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042484 00:02:45.644 13:21:24 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721042484 00:02:45.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042484_collect-vmstat.pm.log 00:02:45.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042484_collect-cpu-load.pm.log 00:02:45.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042484_collect-cpu-temp.pm.log 00:02:45.644 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721042484_collect-bmc-pm.bmc.pm.log 00:02:46.579 13:21:25 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:46.579 13:21:25 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:46.579 13:21:25 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:46.579 13:21:25 -- common/autotest_common.sh@10 -- # set +x 00:02:46.579 13:21:25 -- spdk/autotest.sh@59 -- # create_test_list 00:02:46.579 13:21:25 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:46.579 13:21:25 -- common/autotest_common.sh@10 -- # set +x 00:02:46.838 13:21:26 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:46.838 13:21:26 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:02:46.838 13:21:26 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:46.838 13:21:26 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:46.838 13:21:26 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:46.838 13:21:26 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:46.838 13:21:26 -- common/autotest_common.sh@1455 -- # uname 00:02:46.838 13:21:26 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:46.838 13:21:26 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:46.838 13:21:26 -- common/autotest_common.sh@1475 -- # uname 00:02:46.838 13:21:26 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:46.838 13:21:26 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:46.838 13:21:26 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:46.838 13:21:26 -- spdk/autotest.sh@72 -- # hash lcov 00:02:46.838 13:21:26 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:46.838 13:21:26 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:46.838 --rc lcov_branch_coverage=1 00:02:46.838 --rc lcov_function_coverage=1 00:02:46.838 --rc genhtml_branch_coverage=1 00:02:46.838 --rc genhtml_function_coverage=1 00:02:46.838 --rc genhtml_legend=1 00:02:46.838 --rc geninfo_all_blocks=1 00:02:46.838 ' 00:02:46.838 13:21:26 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:46.838 --rc lcov_branch_coverage=1 00:02:46.838 --rc lcov_function_coverage=1 00:02:46.838 --rc genhtml_branch_coverage=1 00:02:46.838 --rc genhtml_function_coverage=1 00:02:46.838 --rc genhtml_legend=1 00:02:46.838 --rc geninfo_all_blocks=1 00:02:46.838 ' 00:02:46.838 13:21:26 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:46.838 --rc lcov_branch_coverage=1 00:02:46.838 --rc lcov_function_coverage=1 00:02:46.838 --rc genhtml_branch_coverage=1 00:02:46.838 --rc genhtml_function_coverage=1 00:02:46.838 --rc genhtml_legend=1 
00:02:46.838 --rc geninfo_all_blocks=1 00:02:46.838 --no-external' 00:02:46.838 13:21:26 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:46.838 --rc lcov_branch_coverage=1 00:02:46.838 --rc lcov_function_coverage=1 00:02:46.838 --rc genhtml_branch_coverage=1 00:02:46.838 --rc genhtml_function_coverage=1 00:02:46.838 --rc genhtml_legend=1 00:02:46.838 --rc geninfo_all_blocks=1 00:02:46.838 --no-external' 00:02:46.838 13:21:26 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:46.838 lcov: LCOV version 1.14 00:02:46.838 13:21:26 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:04.954 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:04.954 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:17.153 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:17.153 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:17.153 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:17.153 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:17.153 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:17.153 geninfo: WARNING: GCOV did not produce any data 
for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:17.153 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:17.153 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:17.153 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:17.154 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 
00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 
00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:17.154 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:17.154 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:17.154 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:17.154 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:17.414 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:17.414 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:17.414 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:17.414 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:17.414 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:17.673 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:17.673 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:17.673 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:21.859 13:22:00 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:21.859 13:22:00 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:21.859 13:22:00 -- common/autotest_common.sh@10 -- # set +x 00:03:21.859 13:22:00 -- spdk/autotest.sh@91 -- # rm -f 00:03:21.859 13:22:00 -- spdk/autotest.sh@94 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.178 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:25.178 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:25.178 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:25.178 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:25.178 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:25.178 13:22:04 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:25.178 13:22:04 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:25.178 13:22:04 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:25.178 13:22:04 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:25.178 13:22:04 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:25.178 13:22:04 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:25.178 13:22:04 -- 
common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:25.178 13:22:04 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:25.178 13:22:04 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:25.178 13:22:04 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:25.178 13:22:04 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:25.178 13:22:04 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:25.178 13:22:04 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:25.178 13:22:04 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:25.178 13:22:04 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:25.436 No valid GPT data, bailing 00:03:25.436 13:22:04 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:25.436 13:22:04 -- scripts/common.sh@391 -- # pt= 00:03:25.436 13:22:04 -- scripts/common.sh@392 -- # return 1 00:03:25.436 13:22:04 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:25.436 1+0 records in 00:03:25.436 1+0 records out 00:03:25.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00462709 s, 227 MB/s 00:03:25.436 13:22:04 -- spdk/autotest.sh@118 -- # sync 00:03:25.436 13:22:04 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:25.436 13:22:04 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:25.436 13:22:04 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:30.701 13:22:09 -- spdk/autotest.sh@124 -- # uname -s 00:03:30.701 13:22:09 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:30.701 13:22:09 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:30.701 13:22:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:30.701 13:22:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:30.701 13:22:09 -- 
common/autotest_common.sh@10 -- # set +x 00:03:30.701 ************************************ 00:03:30.701 START TEST setup.sh 00:03:30.701 ************************************ 00:03:30.701 13:22:09 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:30.701 * Looking for test storage... 00:03:30.701 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:30.701 13:22:09 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:30.701 13:22:09 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:30.701 13:22:09 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:30.701 13:22:09 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:30.701 13:22:09 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:30.701 13:22:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:30.701 ************************************ 00:03:30.701 START TEST acl 00:03:30.701 ************************************ 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:30.701 * Looking for test storage... 
00:03:30.701 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:30.701 13:22:09 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:30.701 13:22:09 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:30.701 13:22:09 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:30.701 13:22:09 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.017 13:22:13 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:34.017 13:22:13 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:34.017 13:22:13 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.017 13:22:13 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:34.017 13:22:13 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.017 13:22:13 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 Hugepages 00:03:37.306 node hugesize free / total 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 00:03:37.306 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 
13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 
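The `acl.sh` trace above repeats one pattern per PCI device: read a `setup.sh status` row, skip anything whose second field is not a `domain:bus:dev.fn` BDF, skip devices bound to drivers other than `nvme` (`ioatdma`, `vfio-pci`), and collect the rest. A minimal standalone sketch of that classification loop (hypothetical re-creation for illustration; the real script pipes in live `setup.sh status` output and also honors `PCI_BLOCKED`):

```shell
#!/usr/bin/env bash
# Sketch of the device-classification loop traced above (assumed simplification
# of test/setup/acl.sh): keep only rows that look like a PCI BDF and whose
# driver column is "nvme"; everything else hits `continue`, as in the log.
collect_nvme_devs() {
    local _ dev driver devs=()
    # Column layout from the log header: Type BDF Vendor Device NUMA Driver ...
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue   # header / hugepage rows: no BDF
        [[ $driver == nvme ]] || continue   # ioatdma and vfio-pci rows skipped
        devs+=("$dev")
    done
    printf '%s\n' "${devs[@]}"
}

collect_nvme_devs <<'EOF'
I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
EOF
```

With the sample rows above, only `0000:5e:00.0` survives the filters, matching the single `devs+=("$dev")` hit in the trace.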
00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:37.306 13:22:16 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:37.306 13:22:16 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:37.306 13:22:16 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:37.307 13:22:16 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:37.307 13:22:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:37.307 ************************************ 00:03:37.307 START TEST denied 00:03:37.307 ************************************ 00:03:37.307 13:22:16 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:37.307 13:22:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:37.307 13:22:16 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:37.307 13:22:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:37.307 13:22:16 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:37.307 13:22:16 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:41.493 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:41.493 13:22:20 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:41.493 13:22:20 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:46.815 00:03:46.815 real 0m8.987s 00:03:46.815 user 0m2.836s 00:03:46.815 sys 0m5.444s 00:03:46.815 13:22:25 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:46.815 13:22:25 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:46.815 ************************************ 00:03:46.815 END TEST denied 00:03:46.815 ************************************ 00:03:46.815 13:22:25 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:46.815 13:22:25 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:46.815 13:22:25 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:46.815 13:22:25 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.815 13:22:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:46.815 ************************************ 00:03:46.815 START TEST allowed 00:03:46.815 ************************************ 00:03:46.815 13:22:25 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:03:46.815 13:22:25 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:46.815 13:22:25 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:46.815 13:22:25 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:46.815 13:22:25 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.815 13:22:25 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:53.385 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:03:53.385 13:22:31 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:53.385 13:22:31 
setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:53.385 13:22:31 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:53.385 13:22:31 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:53.385 13:22:31 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:56.673 00:03:56.673 real 0m9.680s 00:03:56.673 user 0m2.453s 00:03:56.673 sys 0m4.596s 00:03:56.673 13:22:35 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:56.673 13:22:35 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:56.673 ************************************ 00:03:56.673 END TEST allowed 00:03:56.673 ************************************ 00:03:56.673 13:22:35 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:03:56.673 00:03:56.673 real 0m26.113s 00:03:56.673 user 0m7.904s 00:03:56.673 sys 0m15.075s 00:03:56.673 13:22:35 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:03:56.673 13:22:35 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:56.673 ************************************ 00:03:56.673 END TEST acl 00:03:56.673 ************************************ 00:03:56.673 13:22:35 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:03:56.673 13:22:35 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:56.673 13:22:35 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:56.673 13:22:35 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:56.673 13:22:35 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:56.673 ************************************ 00:03:56.673 START TEST hugepages 00:03:56.673 ************************************ 00:03:56.673 13:22:35 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:56.673 * Looking for test storage... 00:03:56.673 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76739304 kB' 'MemAvailable: 80038824 kB' 'Buffers: 12176 kB' 'Cached: 9472692 kB' 'SwapCached: 0 kB' 'Active: 6529252 kB' 'Inactive: 3456260 
kB' 'Active(anon): 6135668 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503984 kB' 'Mapped: 189408 kB' 'Shmem: 5635024 kB' 'KReclaimable: 206636 kB' 'Slab: 532464 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325828 kB' 'KernelStack: 16160 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7558068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 
13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.673 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 
13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.674 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
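The repeated `continue` records above are setup/common.sh's meminfo scan: `IFS=': ' read -r var val _` splits each line of /proc/meminfo, every key that is not the requested one (`Hugepagesize` here) is skipped, and the matching line echoes its value and returns. A minimal standalone sketch of that pattern — the optional file parameter is added here for illustration and testing; the traced script reads the live file:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern from the trace: scan a
# meminfo-style file line by line and print the value for one key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped -- the "continue" records in the log.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}

get_meminfo Hugepagesize <(echo 'Hugepagesize: 2048 kB')   # prints 2048
```

Because `IFS` contains both `:` and space, the unit (`kB`) lands in `_` and only the numeric value is printed — which is why the trace ends the scan with `echo 2048`.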
00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@17 -- # 
default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.675 13:22:35 setup.sh.hugepages -- 
setup/hugepages.sh@41 -- # echo 0 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:56.675 13:22:35 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:56.675 13:22:35 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:56.675 13:22:35 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:56.675 13:22:35 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:56.675 ************************************ 00:03:56.675 START TEST default_setup 00:03:56.675 ************************************ 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 
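The `clear_hp` records above (hugepages.sh@37-45) iterate both NUMA nodes' hugepage-size directories and write 0 into each pool before exporting `CLEAR_HUGE=yes`. A sketch of that loop — the base-directory parameter is an addition here so it can be exercised against a fake tree; the traced script walks the real /sys hierarchy and needs root to write:

```shell
#!/usr/bin/env bash
shopt -s nullglob extglob
# Sketch of clear_hp from the trace: zero every per-node hugepage pool
# under <base>/node<n>/hugepages/hugepages-<size>kB/nr_hugepages.
clear_hp() {
    local base=${1:-/sys/devices/system/node} node hp
    for node in "$base"/node+([0-9]); do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
}
```

Against the real sysfs on this two-node host, this is the pair of `echo 0` records logged for each node above.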
00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.675 13:22:35 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:59.994 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:59.994 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:59.994 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.6 
(8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:59.994 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:00.253 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:02.796 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:02.796 13:22:41 
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78898692 kB' 'MemAvailable: 82198212 kB' 'Buffers: 12176 kB' 'Cached: 9472812 kB' 'SwapCached: 0 kB' 'Active: 6548328 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154744 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523132 kB' 'Mapped: 189736 kB' 'Shmem: 5635144 kB' 'KReclaimable: 206636 kB' 'Slab: 531408 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324772 kB' 'KernelStack: 16464 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7575748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201240 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
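The `mapfile -t mem` and `mem=("${mem[@]#Node +([0-9]) }")` records at setup/common.sh@28-29 above slurp the whole meminfo snapshot into an array and strip the `Node <n> ` prefix that per-node files (/sys/devices/system/node/node<n>/meminfo) carry, so the same key scan works for both the global and per-node variants. A small demonstration of that prefix strip on sample lines:

```shell
#!/usr/bin/env bash
shopt -s extglob
# Sketch of the snapshot step at setup/common.sh@28-29: read all lines
# into an array, then strip a leading "Node <n> " from each element.
demo_lines='Node 0 MemTotal: 92293472 kB
Node 0 HugePages_Surp: 0'
mapfile -t mem <<< "$demo_lines"
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]}"
# prints:
# MemTotal: 92293472 kB
# HugePages_Surp: 0
```

The `+([0-9])` extglob pattern matches the node number whatever its width, which is why `shopt -s extglob` (or an extglob-enabled shell) is required.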
00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
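The check at hugepages.sh@96 earlier in the trace (`[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]`) reads /sys/kernel/mm/transparent_hugepage/enabled, where the bracketed word is the active mode, and only bothers fetching AnonHugePages — the scan in progress here — when THP is not fully disabled. A sketch of that probe, with a mode argument added for testability:

```shell
#!/usr/bin/env bash
# Sketch of the THP probe at hugepages.sh@96: the sysfs file reads like
# "always [madvise] never"; the bracketed word is the active mode, and
# anon hugepages only matter when that mode is not "never".
thp_active() {
    local mode=${1:-$(</sys/kernel/mm/transparent_hugepage/enabled)}
    [[ $mode != *"[never]"* ]]
}

thp_active "always [madvise] never" && echo "THP on"   # prints "THP on"
```

On this host the active mode is `[madvise]`, so the check passes and the AnonHugePages scan runs, ultimately returning 0 (no anonymous THP in use).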
00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 
13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
[... identical compare/continue iterations for Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted ...]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78895060 kB' 'MemAvailable: 82194580 kB' 'Buffers: 12176 kB' 'Cached: 9472812 kB' 'SwapCached: 0 kB' 'Active: 6549544 kB' 'Inactive: 3456260 kB' 'Active(anon): 6155960 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524376 kB' 'Mapped: 189660 kB' 'Shmem: 5635144 kB' 'KReclaimable: 206636 kB' 'Slab: 531392 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324756 kB' 'KernelStack: 16736 kB' 'PageTables: 9156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7575768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201608 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB'
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.797 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
[... identical compare/continue iterations for each /proc/meminfo key from MemFree through HugePages_Rsvd ...]
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.798 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78893328 kB' 'MemAvailable: 82192848 kB' 'Buffers: 12176 kB' 'Cached: 9472812 kB' 'SwapCached: 0 kB' 'Active: 6551056 kB' 'Inactive: 3456260 kB' 'Active(anon): 6157472 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525888 kB' 'Mapped: 189660 kB' 'Shmem: 5635144 kB' 'KReclaimable: 206636 kB' 'Slab: 531904 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325268 kB' 'KernelStack: 17120 kB' 'PageTables: 10164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7575788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201896 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB'
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
[... identical compare/continue iterations for each /proc/meminfo key from MemFree through KernelStack ...]
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.799 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 
13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:02.800 nr_hugepages=1024 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:02.800 resv_hugepages=0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:02.800 surplus_hugepages=0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:02.800 anon_hugepages=0 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78892048 kB' 'MemAvailable: 82191568 kB' 'Buffers: 12176 kB' 'Cached: 9472812 kB' 'SwapCached: 0 kB' 'Active: 6550864 kB' 'Inactive: 3456260 kB' 'Active(anon): 6157280 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525600 kB' 'Mapped: 189652 kB' 'Shmem: 5635144 kB' 'KReclaimable: 206636 kB' 'Slab: 531904 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325268 kB' 'KernelStack: 17056 
kB' 'PageTables: 10372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7575812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201128 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.800 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:41 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 
13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 
13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.801 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:02.801 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36536868 kB' 'MemUsed: 11580072 kB' 'SwapCached: 0 kB' 'Active: 5375896 kB' 'Inactive: 3372048 kB' 'Active(anon): 5217992 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468256 kB' 'Mapped: 90492 kB' 'AnonPages: 282808 kB'
'Shmem: 4938304 kB' 'KernelStack: 9128 kB' 'PageTables: 4716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331504 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 
13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.802 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:02.803 node0=1024 expecting 1024
00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:02.803
00:04:02.803 real 0m6.285s
00:04:02.803 user 0m1.485s
00:04:02.803 sys 0m2.437s
00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:02.803 13:22:42 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:04:02.803 ************************************
00:04:02.803 END TEST default_setup
00:04:02.803 ************************************
00:04:02.803 13:22:42 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:02.803 13:22:42 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:02.803 13:22:42 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:02.803 13:22:42 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:02.803 13:22:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:02.803 ************************************
00:04:02.803 START TEST per_node_1G_alloc
00:04:02.803 ************************************
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:04:02.803 13:22:42
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:02.803 13:22:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:02.803 13:22:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:06.091 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:06.091 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:06.091 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:06.091 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:06.091 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
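The xtrace above shows `setup/common.sh`'s `get_meminfo` walking a meminfo file field by field with `IFS=': '` and `read -r var val _`, emitting `continue` for every field that is not the one requested. A minimal standalone sketch of that pattern follows; `get_field` and the temp file in the usage note are illustrative names, not SPDK's (the real helper is `get_meminfo` and also resolves the per-node `/sys/devices/system/node/nodeN/meminfo` path):

```shell
# Sketch of the meminfo-scanning pattern traced above (assumed names).
get_field() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching field is skipped; this is what produces
        # the long runs of "continue" lines in the xtrace output.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}
```

For example, fed a file containing `HugePages_Total: 1024`, `get_field HugePages_Total <file>` prints `1024`, matching the `echo 1024` seen in the trace.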
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78893984 kB' 'MemAvailable: 82193504 kB' 'Buffers: 12176 kB' 'Cached: 9472948 kB' 'SwapCached: 0 kB' 'Active: 6548160 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154576 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522124 kB' 'Mapped: 189704 kB' 'Shmem: 5635280 kB' 'KReclaimable: 206636 kB' 'Slab: 531588 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324952 kB' 'KernelStack: 16176 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7573660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB'
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- #
IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 
13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.091 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 
13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.092 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78896504 kB' 'MemAvailable: 82196024 kB' 'Buffers: 12176 kB' 'Cached: 9472948 kB' 'SwapCached: 0 kB' 'Active: 6548816 kB' 'Inactive: 3456260 kB' 'Active(anon): 6155232 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522784 kB' 'Mapped: 189704 kB' 'Shmem: 5635280 kB' 'KReclaimable: 206636 kB' 'Slab: 531588 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324952 kB' 'KernelStack: 16192 kB' 'PageTables: 8356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7573680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 
13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.093 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78896504 kB' 'MemAvailable: 82196024 kB' 'Buffers: 12176 kB' 'Cached: 9472948 kB' 'SwapCached: 0 kB' 'Active: 6549416 kB' 'Inactive: 3456260 kB' 'Active(anon): 6155832 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523440 kB' 'Mapped: 189704 kB' 'Shmem: 5635280 kB' 'KReclaimable: 206636 kB' 'Slab: 531588 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324952 kB' 'KernelStack: 16208 kB' 'PageTables: 8408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7573704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201048 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.094 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 
13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.095 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:06.096 nr_hugepages=1024 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:06.096 resv_hugepages=0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:06.096 surplus_hugepages=0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:06.096 anon_hugepages=0 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 
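The xtrace above shows setup/common.sh's get_meminfo walking /proc/meminfo field by field with `IFS=': '` and `read -r var val _` until the requested key (here HugePages_Rsvd, then HugePages_Total) matches. A minimal standalone sketch of that parsing pattern — not SPDK's actual helper, and assuming a flat, system-wide meminfo file — might look like:

```shell
#!/usr/bin/env bash
# Sketch of the lookup pattern visible in the trace: split each
# "Key:   value [unit]" line on colon/space and echo the value for
# one requested field.
get_meminfo_field() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    # IFS=': ' splits on the colon and collapses the padding spaces,
    # leaving var=Key, val=number, _=optional unit (e.g. "kB").
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}
```

The real script instead mapfiles the whole file once, strips any leading "Node N " prefix so the same loop works on the per-node /sys/devices/system/node/nodeN/meminfo files, and scans the array — but the field matching is the same key-by-key walk the trace records.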
00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.096 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.097 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78896800 kB' 'MemAvailable: 82196320 kB' 'Buffers: 12176 kB' 'Cached: 9472952 kB' 'SwapCached: 0 kB' 'Active: 6549952 kB' 'Inactive: 3456260 kB' 'Active(anon): 6156368 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523948 kB' 'Mapped: 189688 kB' 'Shmem: 5635284 kB' 'KReclaimable: 206636 kB' 'Slab: 531588 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324952 kB' 'KernelStack: 16208 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7573724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 
kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:06.097 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:06.097 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[field-scan trace condensed: the "IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue" record repeats for every remaining meminfo field until HugePages_Total matches]
00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37578360 kB' 'MemUsed: 10538580 kB' 'SwapCached: 0 kB' 'Active: 5374920 kB' 'Inactive: 3372048 kB' 'Active(anon): 5217016 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468348 kB' 'Mapped: 90480 kB' 'AnonPages: 281788 kB' 'Shmem: 4938396 kB' 'KernelStack: 8888 kB' 'PageTables: 4280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331452 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205076 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.098 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[field-scan trace condensed: the same read/continue loop repeats over the remaining node0 meminfo fields while looking for HugePages_Surp]
00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.099 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.099 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41317796 kB' 'MemUsed: 2858736 kB' 'SwapCached: 0 kB' 'Active: 1172484 kB' 'Inactive: 84212 kB' 'Active(anon): 936804 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1016860 kB' 'Mapped: 99128 kB' 'AnonPages: 239952 kB' 'Shmem: 696968 kB' 'KernelStack: 7256 kB' 'PageTables: 3896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80260 kB' 'Slab: 200096 kB' 'SReclaimable: 80260 kB' 'SUnreclaim: 119836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.100 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:06.101 node0=512 expecting 512
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:06.101 node1=512 expecting 512
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:06.101
00:04:06.101 real 0m3.033s
00:04:06.101 user 0m0.961s
00:04:06.101 sys 0m1.921s
13:22:45 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:06.101 13:22:45 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:06.101 ************************************
00:04:06.101 END TEST per_node_1G_alloc
00:04:06.101 ************************************
00:04:06.101 13:22:45 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:06.101 13:22:45 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:06.101 13:22:45 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:06.101 13:22:45 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:06.101 13:22:45 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:06.101 ************************************
00:04:06.101 START TEST even_2G_alloc
00:04:06.101 ************************************
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc
00:04:06.101
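The per_node_1G_alloc xtrace above is dominated by SPDK's get_meminfo helper: it loads /proc/meminfo (or /sys/devices/system/node/nodeN/meminfo when a node is given), strips the "Node N " prefix, then scans field by field with IFS=': ' until the requested key (here HugePages_Surp) matches. A minimal self-contained sketch of that pattern, reconstructed from the trace rather than copied from setup/common.sh; the explicit file argument is an addition for testability:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the xtrace: resolve a single
# field from a meminfo-style file. Reconstructed from the trace; the optional
# file argument is invented here so the sketch can run against a fixture.
shopt -s extglob

get_meminfo() {
	local get=$1 mem_f=${2:-/proc/meminfo} var val _
	local -a mem
	mapfile -t mem < "$mem_f"
	# Per-node meminfo lines look like "Node 1 HugePages_Surp: 0"; drop the prefix.
	mem=("${mem[@]#Node +([0-9]) }")
	while IFS=': ' read -r var val _; do
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

# Demo against a synthetic node1 meminfo file:
printf 'Node 1 HugePages_Total: 512\nNode 1 HugePages_Surp: 0\n' > /tmp/node1_meminfo
get_meminfo HugePages_Surp /tmp/node1_meminfo   # prints 0
```

Because the whole file is captured once per call and then re-scanned with read/compare/continue, the trace repeats that cycle for every field preceding HugePages_Surp, which is what produces the bulk of the log above.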
13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:06.101 13:22:45 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:09.386 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:09.386 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:09.386 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:09.386 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:09.386 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
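The get_test_nr_hugepages trace above converts the requested size=2097152 kB (2 GiB) into nr_hugepages=1024 at the 2048 kB default hugepage size, then the hugepages.sh@81-@84 countdown loop hands each of the two NUMA nodes an equal 512-page share. A short sketch of that even split; the function name is invented for illustration and is not the real setup/hugepages.sh code:

```shell
#!/usr/bin/env bash
# Hypothetical helper mirroring the nodes_test[_no_nodes - 1]=... countdown
# seen in the trace: distribute nr_hugepages evenly across NUMA nodes.
split_hugepages_evenly() {
	local nr_hugepages=$1 no_nodes=$2
	local -a nodes_test
	local _no_nodes=$no_nodes
	while (( _no_nodes > 0 )); do
		# Fill the array from the highest node index down, as the trace does.
		nodes_test[_no_nodes - 1]=$(( nr_hugepages / no_nodes ))
		(( _no_nodes-- ))
	done
	echo "${nodes_test[@]}"
}

# 2 GiB request / 2048 kB default hugepage size = 1024 pages, over 2 nodes:
split_hugepages_evenly $(( 2097152 / 2048 )) 2   # prints: 512 512
```

With HUGE_EVEN_ALLOC=yes and NRHUGE=1024 exported just after this in the log, setup.sh is expected to realize exactly that 512/512 layout, which verify_nr_hugepages then checks.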
00:04:09.386 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:09.386 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:09.386 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:09.386 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:09.386 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78911392 kB' 'MemAvailable: 82210912 kB' 'Buffers: 12176 kB' 'Cached: 9473104 kB' 'SwapCached: 0 kB' 'Active: 6554684 kB' 'Inactive: 3456260 kB' 'Active(anon): 6161100 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528692 kB' 'Mapped: 189040 kB' 'Shmem: 5635436 kB' 'KReclaimable: 206636 kB' 'Slab: 531992 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325356 kB' 'KernelStack: 16656 kB' 'PageTables: 9152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7577852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201260 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.386 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue [... xtrace condensed: setup/common.sh@31-32 repeat the identical IFS=': ' / read -r var val _ / continue cycle for every remaining /proc/meminfo field (MemFree through HardwareCorrupted) until the requested field matches ...] 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.388 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78924176 kB' 'MemAvailable: 82223696 kB' 'Buffers: 12176 kB' 'Cached: 9473108 kB' 'SwapCached: 0 kB' 'Active: 
6547932 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154348 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522388 kB' 'Mapped: 188956 kB' 'Shmem: 5635440 kB' 'KReclaimable: 206636 kB' 'Slab: 531864 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325228 kB' 'KernelStack: 16240 kB' 'PageTables: 8764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' [... xtrace condensed: setup/common.sh@31-32 again repeat the identical IFS=': ' / read -r var val _ / continue cycle, now scanning for HugePages_Surp; the captured trace is truncated here mid-scan at setup/common.sh@32 ...]
[[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.389 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.390 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78924060 kB' 'MemAvailable: 82223580 kB' 'Buffers: 12176 kB' 'Cached: 9473124 kB' 'SwapCached: 0 kB' 'Active: 6547056 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153472 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521360 kB' 'Mapped: 189016 kB' 'Shmem: 5635456 kB' 'KReclaimable: 206636 kB' 'Slab: 531868 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325232 kB' 'KernelStack: 16176 kB' 'PageTables: 7900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.390 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 
13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:09.391 nr_hugepages=1024 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:09.391 resv_hugepages=0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:09.391 surplus_hugepages=0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:09.391 anon_hugepages=0 00:04:09.391 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.392 13:22:48 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78924172 kB' 'MemAvailable: 82223692 kB' 'Buffers: 12176 kB' 'Cached: 9473148 kB' 'SwapCached: 0 kB' 'Active: 6547076 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153492 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521352 kB' 'Mapped: 189016 kB' 'Shmem: 5635480 kB' 'KReclaimable: 206636 kB' 'Slab: 531868 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325232 kB' 'KernelStack: 16224 kB' 'PageTables: 7884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.392 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.392 13:22:48 [... identical setup/common.sh@31-32 IFS=': ' / read -r var val _ / [[ field == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue trace repeated once per remaining /proc/meminfo field, MemFree through Unaccepted ...] 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@31 -- # IFS=': ' 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@18 -- # local node=0 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37578304 kB' 'MemUsed: 10538636 kB' 'SwapCached: 0 kB' 'Active: 5375768 kB' 'Inactive: 3372048 kB' 'Active(anon): 5217864 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468452 kB' 'Mapped: 90548 kB' 'AnonPages: 282688 kB' 'Shmem: 4938500 kB' 'KernelStack: 8920 kB' 'PageTables: 4328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331960 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205584 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.393 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.394 13:22:48 [... identical setup/common.sh@31-32 IFS=': ' / read -r var val _ / [[ field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue trace repeated once per remaining node0 meminfo field, MemFree through AnonHugePages ...] 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.394 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41342340 kB' 'MemUsed: 2834192 kB' 'SwapCached: 0 kB' 'Active: 1171616 kB' 'Inactive: 84212 kB' 'Active(anon): 935936 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1016892 kB' 'Mapped: 98468 kB' 'AnonPages: 238964 kB' 'Shmem: 697000 kB' 'KernelStack: 7336 kB' 'PageTables: 3880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80260 kB' 'Slab: 199908 kB' 'SReclaimable: 80260 kB' 'SUnreclaim: 119648 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:09.395 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the read/compare loop hits "continue" for every node1 meminfo field from MemFree through HugePages_Free while scanning for HugePages_Surp] 00:04:09.395-00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:09.396 node0=512 expecting 512 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.396 13:22:48
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:09.396 node1=512 expecting 512 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:09.396 00:04:09.396 real 0m3.552s 00:04:09.396 user 0m1.369s 00:04:09.396 sys 0m2.262s 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:09.396 13:22:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:09.396 ************************************ 00:04:09.396 END TEST even_2G_alloc 00:04:09.396 ************************************ 00:04:09.656 13:22:48 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:09.656 13:22:48 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:09.656 13:22:48 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:09.656 13:22:48 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.656 13:22:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:09.656 ************************************ 00:04:09.656 START TEST odd_alloc 00:04:09.656 ************************************ 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:09.656 13:22:48 
setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.656 13:22:48 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:12.944 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:12.944 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:12.944 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.944 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:12.944 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.944 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.206 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@91 -- # local sorted_s 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78902100 kB' 'MemAvailable: 82201620 kB' 'Buffers: 12176 kB' 'Cached: 9473248 kB' 'SwapCached: 0 kB' 'Active: 6547316 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153732 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 
3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521356 kB' 'Mapped: 188624 kB' 'Shmem: 5635580 kB' 'KReclaimable: 206636 kB' 'Slab: 531732 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325096 kB' 'KernelStack: 16112 kB' 'PageTables: 8060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7569528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.206 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.207 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.208 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78902704 kB' 'MemAvailable: 82202224 kB' 'Buffers: 12176 kB' 'Cached: 9473252 kB' 'SwapCached: 0 kB' 'Active: 6547164 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153580 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521248 kB' 'Mapped: 188584 kB' 'Shmem: 5635584 kB' 'KReclaimable: 206636 kB' 'Slab: 531752 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325116 kB' 'KernelStack: 16112 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7570856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 
13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.208 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.209 13:22:52 
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:13.209 [... common.sh@32 key scan condensed: each remaining /proc/meminfo key (SUnreclaim through HugePages_Free) fails the HugePages_Surp match and hits "continue" ...]
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.209 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:13.210 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78903304 kB' 'MemAvailable: 82202824 kB' 'Buffers: 12176 kB' 'Cached: 9473268 kB' 'SwapCached: 0 kB' 'Active: 6547192 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153608 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521812 kB' 'Mapped: 188608 kB' 'Shmem: 5635600 kB' 'KReclaimable: 206636 kB' 'Slab: 531748 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325112 kB' 'KernelStack: 16192 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7570684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB'
00:04:13.474 [... common.sh@32 key scan condensed: MemTotal through HugePages_Free each fail the HugePages_Rsvd match and hit "continue" ...]
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:13.474 nr_hugepages=1025
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:13.474 resv_hugepages=0
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:13.474 surplus_hugepages=0
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:13.474 anon_hugepages=0
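The long runs of common.sh@32 "continue" lines in this log are one technique repeated: get_meminfo splits each meminfo line on IFS=': ' and skips every key until the requested one matches, then echoes its value. A minimal sketch of that scan, assuming a simplified helper (the real setup/common.sh is more elaborate, caching the lines in a mem array and stripping per-node "Node N" prefixes):

```shell
#!/usr/bin/env bash
# Hypothetical simplified get_meminfo; the upstream helper in
# setup/common.sh differs in detail but performs the same key scan.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching key produces one of the "continue"
        # xtrace lines seen in the log above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done <"$mem_f"
    return 1
}

# Usage against a small sample file instead of the live /proc/meminfo:
sample=$(mktemp)
printf '%s\n' 'MemTotal: 92293472 kB' \
              'HugePages_Total: 1025' \
              'HugePages_Surp: 0' >"$sample"
get_meminfo HugePages_Surp "$sample"   # prints 0
rm -f "$sample"
```

With `IFS=': '` the `read` splits on both the colon and the space, so `var` gets the key, `val` the number, and the trailing `kB` unit (when present) falls into the throwaway `_` variable.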
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78904888 kB' 'MemAvailable: 82204408 kB' 'Buffers: 12176 kB' 'Cached: 9473288 kB' 'SwapCached: 0 kB' 'Active: 6547352 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153768 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521400 kB' 'Mapped: 188608 kB' 'Shmem: 5635620 kB' 'KReclaimable: 206636 kB' 'Slab: 531740 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325104 kB' 'KernelStack: 16256 kB' 'PageTables: 8164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7572192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB'
00:04:13.474 [... common.sh@32 key scan in progress: MemTotal through Unevictable fail the HugePages_Total match; scan continues ...]
13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 
13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.474 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local 
mem_f mem 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37568156 kB' 'MemUsed: 10548784 kB' 'SwapCached: 0 kB' 'Active: 5376280 kB' 'Inactive: 3372048 kB' 'Active(anon): 5218376 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468560 kB' 'Mapped: 90292 kB' 'AnonPages: 283008 kB' 'Shmem: 4938608 kB' 'KernelStack: 8952 kB' 'PageTables: 4484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331604 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205228 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 
13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 [... identical compare/continue iterations elided for the remaining fields: Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free ...] 00:04:13.475 13:22:52
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41334688 kB' 'MemUsed: 2841844 
kB' 'SwapCached: 0 kB' 'Active: 1171240 kB' 'Inactive: 84212 kB' 'Active(anon): 935560 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1016928 kB' 'Mapped: 98316 kB' 'AnonPages: 238556 kB' 'Shmem: 697036 kB' 'KernelStack: 7256 kB' 'PageTables: 3792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80260 kB' 'Slab: 200136 kB' 'SReclaimable: 80260 kB' 'SUnreclaim: 119876 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:13.475 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:13.475 [... identical compare/continue iterations elided for the remaining fields, MemFree through HugePages_Free ...] 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:13.476 node0=512 expecting
513 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:13.476 node1=513 expecting 512 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:13.476 00:04:13.476 real 0m3.868s 00:04:13.476 user 0m1.511s 00:04:13.476 sys 0m2.463s 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:13.476 13:22:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:13.476 ************************************ 00:04:13.476 END TEST odd_alloc 00:04:13.476 ************************************ 00:04:13.476 13:22:52 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:13.476 13:22:52 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:13.476 13:22:52 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:13.476 13:22:52 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.476 13:22:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:13.476 ************************************ 00:04:13.476 START TEST custom_alloc 00:04:13.476 ************************************ 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@170 -- # nodes_hp=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@83 -- # : 256 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.476 13:22:52 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.476 13:22:52 
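[Editor's note] The hugepages.sh@181-183 iterations traced above build the HUGENODE list by appending one `nodes_hp[<node>]=<count>` entry per NUMA node and summing the expected total. A minimal standalone sketch of that bookkeeping, under the assumption that the final comma-joined form comes from the `local IFS=,` visible at hugepages.sh@167 (variable names copied from the trace; this is not SPDK's actual helper):

```shell
#!/usr/bin/env bash
# Sketch of the HUGENODE bookkeeping seen in the trace above.
# nodes_hp holds the per-NUMA-node hugepage targets from this run.
declare -a nodes_hp=([0]=512 [1]=1024)

declare -a HUGENODE=()
_nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")   # hugepages.sh@182
    (( _nr_hugepages += nodes_hp[node] ))             # hugepages.sh@183
done

# Setting IFS=, in a subshell joins the array elements with commas,
# mirroring the "local IFS=," in hugepages.sh.
joined=$(IFS=,; echo "${HUGENODE[*]}")
echo "$joined"            # nodes_hp[0]=512,nodes_hp[1]=1024
echo "$_nr_hugepages"     # 1536
```

The joined string and total match the `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'` and `nr_hugepages=1536` values recorded later in this log.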
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.476 13:22:52 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:17.671 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:17.671 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:17.671 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:17.671 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:00:04.3 (8086 2021): Already using 
the vfio-pci driver 00:04:17.671 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.671 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:17.671 13:22:56 
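[Editor's note] The get_meminfo calls traced throughout (setup/common.sh@17-33) read a meminfo-style file, strip the `Node <n> ` prefix that per-node files under /sys/devices/system/node carry, and scan `field: value` pairs until the requested field matches, echoing 0 as a fallback. A minimal re-creation of that pattern; the function name and the sed-based prefix strip are my own (SPDK's helper uses `mapfile` plus an extglob parameter expansion instead):

```shell
# Sketch of the get_meminfo scan traced above. Prints the value of field $1
# from a meminfo-style file $2; prints 0 when the field is absent, mirroring
# the "echo 0" fallback at setup/common.sh@33.
get_meminfo_field() {
    local get=$1 mem_f=$2 var val _
    # Per-node meminfo lines look like "Node 1 HugePages_Surp: 0";
    # drop the "Node <n> " prefix before splitting on ': '.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    echo 0
}

# Usage against a synthetic per-node meminfo (values copied from this run):
tmp=$(mktemp)
printf 'Node 1 HugePages_Total: 513\nNode 1 HugePages_Free: 513\nNode 1 HugePages_Surp: 0\n' > "$tmp"
get_meminfo_field HugePages_Surp "$tmp"   # prints 0
get_meminfo_field HugePages_Total "$tmp"  # prints 513
rm -f "$tmp"
```

Splitting on `IFS=': '` leaves the trailing `kB` unit in `_`, which is why the trace always reads three tokens (`var val _`).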
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77824340 kB' 'MemAvailable: 81123860 kB' 'Buffers: 12176 kB' 'Cached: 9473400 kB' 'SwapCached: 0 kB' 'Active: 6547540 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153956 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521456 kB' 'Mapped: 188600 kB' 'Shmem: 5635732 kB' 'KReclaimable: 206636 kB' 'Slab: 531688 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325052 kB' 'KernelStack: 16160 kB' 'PageTables: 8100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7569952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.671 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.672 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 
13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
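The repeated `IFS=': ' -- read -r var val _` / `[[ $var == … ]] -- continue` lines in this trace are one pattern: `setup/common.sh`'s `get_meminfo` splits each meminfo line on `': '` and echoes the value whose key matches. A minimal standalone sketch of that lookup loop (reading from stdin with an inline sample rather than the live `/proc/meminfo`, so it runs anywhere; the function name mirrors the log but the body is an assumption, not the actual SPDK source):

```shell
#!/usr/bin/env bash
# Sketch of the meminfo lookup seen in the trace: split each
# "Key: value kB" line on ':' / space and print the value for
# the requested key. Reads stdin so it is self-contained.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

# Sample lines copied from the dump above.
sample='MemTotal: 92293472 kB
HugePages_Total: 1536
HugePages_Surp: 0
AnonHugePages: 0 kB'

get_meminfo HugePages_Total <<<"$sample"   # prints 1536
```

On a non-matching key the loop simply `continue`s to the next line, which is why the trace prints one `[[ … ]]`/`continue` pair per meminfo field before reaching the requested one.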
00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77824792 kB' 'MemAvailable: 81124312 kB' 'Buffers: 12176 kB' 'Cached: 9473404 kB' 'SwapCached: 0 kB' 'Active: 6547452 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153868 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521312 kB' 'Mapped: 188600 kB' 'Shmem: 5635736 kB' 'KReclaimable: 206636 kB' 'Slab: 531672 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325036 kB' 'KernelStack: 16096 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7569972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.673 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.673 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
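Both meminfo dumps captured in this trace are internally consistent with the `nr_hugepages=1536` the script set: 1536 free 2 MiB hugepages account exactly for the reported `Hugetlb` total. A quick arithmetic check (values copied from the dumps above):

```shell
#!/usr/bin/env bash
# Cross-check the hugepage accounting from the captured meminfo:
# HugePages_Total pages of Hugepagesize kB each should equal Hugetlb.
hugepages_total=1536      # 'HugePages_Total: 1536'
hugepagesize_kb=2048      # 'Hugepagesize: 2048 kB'
echo $(( hugepages_total * hugepagesize_kb ))   # 3145728, matches 'Hugetlb: 3145728 kB'
```

`HugePages_Free: 1536` alongside `HugePages_Rsvd: 0` and `HugePages_Surp: 0` also confirms the pool was allocated but not yet consumed at this point in the run.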
00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.674 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.675 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77824036 kB' 'MemAvailable: 81123556 kB' 'Buffers: 12176 kB' 'Cached: 9473440 kB' 'SwapCached: 0 kB' 'Active: 6547108 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153524 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520952 kB' 'Mapped: 188600 kB' 'Shmem: 5635772 kB' 'KReclaimable: 206636 kB' 'Slab: 531672 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325036 kB' 'KernelStack: 16096 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7569992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.675 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:17.676 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:17.677 nr_hugepages=1536 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:17.677 resv_hugepages=0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:17.677 surplus_hugepages=0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:17.677 anon_hugepages=0 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.677 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77825024 kB' 'MemAvailable: 81124544 kB' 'Buffers: 12176 kB' 'Cached: 9473468 kB' 'SwapCached: 0 kB' 'Active: 6547940 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154356 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521836 kB' 'Mapped: 188600 kB' 'Shmem: 5635800 kB' 'KReclaimable: 206636 kB' 'Slab: 531672 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 325036 kB' 'KernelStack: 16128 kB' 'PageTables: 8056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7571508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 
863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.677 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 
13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.678 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:17.679 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37556564 kB' 'MemUsed: 10560376 kB' 'SwapCached: 0 kB' 'Active: 5377068 kB' 'Inactive: 3372048 kB' 'Active(anon): 5219164 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468700 kB' 'Mapped: 90304 kB' 'AnonPages: 283688 kB' 'Shmem: 4938748 kB' 'KernelStack: 8936 kB' 'PageTables: 4428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331412 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205036 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 
13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 
13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 
13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.679 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.680 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.680 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40271280 kB' 'MemUsed: 3905252 kB' 'SwapCached: 0 kB' 'Active: 1171536 kB' 'Inactive: 84212 kB' 'Active(anon): 935856 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1016948 kB' 'Mapped: 98320 kB' 'AnonPages: 238892 kB' 'Shmem: 697056 kB' 'KernelStack: 7320 kB' 'PageTables: 4020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80260 kB' 'Slab: 200260 kB' 'SReclaimable: 80260 kB' 'SUnreclaim: 120000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.681 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:17.682 13:22:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:17.682 node0=512 expecting 512 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:17.682 node1=1024 expecting 1024 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:17.682 00:04:17.682 real 0m3.852s 00:04:17.682 user 0m1.582s 00:04:17.682 sys 0m2.366s 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:17.682 13:22:56 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:17.682 ************************************ 00:04:17.682 END TEST custom_alloc 00:04:17.682 ************************************ 00:04:17.682 13:22:56 setup.sh.hugepages -- 
common/autotest_common.sh@1142 -- # return 0 00:04:17.682 13:22:56 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:17.682 13:22:56 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:17.682 13:22:56 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:17.682 13:22:56 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:17.682 ************************************ 00:04:17.682 START TEST no_shrink_alloc 00:04:17.682 ************************************ 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:17.682 13:22:56 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.682 13:22:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:21.018 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:21.018 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:21.018 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:21.018 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 
00:04:21.018 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:21.018 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.018 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.018 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78863212 kB' 'MemAvailable: 82162732 kB' 'Buffers: 12176 kB' 'Cached: 9473556 kB' 'SwapCached: 0 kB' 'Active: 6548080 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154496 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521920 kB' 'Mapped: 188620 kB' 'Shmem: 5635888 kB' 'KReclaimable: 206636 kB' 'Slab: 531536 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324900 kB' 'KernelStack: 16144 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:21.019 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.019 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.020 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78863316 kB' 'MemAvailable: 82162836 kB' 'Buffers: 12176 kB' 'Cached: 9473560 kB' 'SwapCached: 0 kB' 'Active: 6547648 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154064 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521436 kB' 'Mapped: 188612 kB' 'Shmem: 5635892 kB' 'KReclaimable: 206636 kB' 'Slab: 531520 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324884 kB' 'KernelStack: 16112 kB' 'PageTables: 8028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.020 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.021 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:21.022 
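The scan above (ending in `echo 0` / `return 0` and `surp=0`) walks every `Key: value` record of the meminfo dump until the requested key (`HugePages_Surp`) matches. A minimal sketch of that pattern, with illustrative names standing in for the real `setup/common.sh` helper, might look like:

```shell
# Hedged sketch of the key-scan pattern visible in the trace above:
# split each /proc/meminfo record on ": " into key and value, skip
# non-matching keys ("continue" in the trace), and print the value of
# the requested key. Function and variable names are illustrative.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < /proc/meminfo
    return 1
}
```

On a Linux host, `get_meminfo_value HugePages_Surp` would print the surplus-page count (0 throughout this run).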
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78863316 kB' 'MemAvailable: 82162836 kB' 'Buffers: 12176 kB' 'Cached: 9473596 kB' 'SwapCached: 0 kB' 'Active: 6547156 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153572 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520928 kB' 'Mapped: 188612 kB' 'Shmem: 5635928 kB' 'KReclaimable: 206636 kB' 'Slab: 531520 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324884 kB' 'KernelStack: 16112 kB' 'PageTables: 8024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571044 kB' 
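The function prologue traced here (`local node=`, `mem_f=/proc/meminfo`, the `[[ -e /sys/devices/system/node/node/meminfo ]]` test) selects a meminfo source: with `node` empty the per-node path does not exist, so the scan falls back to the system-wide `/proc/meminfo`. A hedged, self-contained sketch of that selection, under the assumption that this is all the prologue does:

```shell
# Hedged sketch of the node-aware source selection seen in the trace
# (setup/common.sh@18-25): prefer a per-NUMA-node meminfo file when a
# node number is given and the sysfs path exists; otherwise fall back
# to the system-wide /proc/meminfo. Names are illustrative.
pick_meminfo_file() {
    local node=$1
    local mem_f=/proc/meminfo
    # With node="", this checks .../node/node/meminfo, which never
    # exists, so the fallback is used -- exactly as in the trace.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "$mem_f"
}
```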
'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.022 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.023 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@100 -- # resv=0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:21.024 nr_hugepages=1024 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:21.024 resv_hugepages=0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:21.024 surplus_hugepages=0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:21.024 anon_hugepages=0 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78866816 kB' 'MemAvailable: 82166336 kB' 'Buffers: 12176 kB' 'Cached: 9473600 kB' 'SwapCached: 0 kB' 'Active: 6547440 kB' 'Inactive: 3456260 kB' 'Active(anon): 6153856 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521188 kB' 'Mapped: 188612 kB' 'Shmem: 5635932 kB' 'KReclaimable: 206636 kB' 'Slab: 531520 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324884 kB' 'KernelStack: 16144 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.024 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[identical trace iterations condensed: for each remaining /proc/meminfo key, MemAvailable through FileHugePages, the loop fails the [[ $var == HugePages_Total ]] test at common.sh@32 and executes "continue", resetting IFS=': ' and re-running read -r var val _ at common.sh@31 each pass] 00:04:21.286 13:23:00
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:21.286 
13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36511892 kB' 'MemUsed: 11605048 kB' 'SwapCached: 0 kB' 'Active: 5375344 kB' 'Inactive: 3372048 kB' 'Active(anon): 5217440 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468812 kB' 'Mapped: 90800 kB' 'AnonPages: 281716 kB' 'Shmem: 4938860 kB' 'KernelStack: 8936 kB' 'PageTables: 4384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331484 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 205108 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
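The trace above is produced by `get_meminfo` in `setup/common.sh`: it reads `/proc/meminfo` (or `/sys/devices/system/node/nodeN/meminfo` when a node argument is given, see common.sh@23-24), strips the `Node N ` prefix from per-node lines (common.sh@29), then re-reads each line with `IFS=': '` until the requested key matches and its value is echoed (common.sh@33). A minimal standalone sketch of that parsing pattern follows; the helper name `get_meminfo_from` and the explicit file argument are illustrative, not part of SPDK:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo parsing pattern seen in the trace: split each
# meminfo line on ': ' and echo the value of the requested key.
# get_meminfo_from is a hypothetical helper, not the SPDK original.
get_meminfo_from() {
    local file=$1 get=$2 var val _
    # Per-node meminfo files prefix every line with "Node N "; strip it so
    # /proc/meminfo and nodeN/meminfo parse identically (common.sh@29 does
    # the same with a "${mem[@]#Node +([0-9]) }" expansion).
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"        # common.sh@33: echo the matched value
            return 0
        fi
        # non-matching keys fall through, mirroring the "continue" spam above
    done < <(sed -E 's/^Node [0-9]+ //' "$file")
    return 1                   # key not found
}
```

Every `continue` line in the trace is one non-matching key falling through this loop, which is why a single `get_meminfo HugePages_Total` call emits roughly fifty trace iterations before the match at common.sh@33.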
00:04:21.286 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[identical trace iterations condensed: for each remaining node0 meminfo key, MemFree through Unaccepted, the loop fails the [[ $var == HugePages_Surp ]] test at common.sh@32 and executes "continue", resetting IFS=': ' and re-running read -r var val _ at common.sh@31 each pass]
setup/common.sh@31 -- # read -r var val _ 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:21.287 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:21.288 node0=1024 expecting 1024 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:21.288 13:23:00 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.288 13:23:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:24.580 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:24.580 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:24.580 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:24.580 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:24.580 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:24.581 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:24.581 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:24.581 INFO: Requested 512 
hugepages but 1024 already allocated on node0 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.581 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78834088 kB' 'MemAvailable: 82133608 kB' 'Buffers: 12176 kB' 'Cached: 9473684 kB' 'SwapCached: 0 kB' 'Active: 6547844 kB' 'Inactive: 3456260 kB' 'Active(anon): 6154260 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521432 kB' 'Mapped: 189140 kB' 'Shmem: 5636016 kB' 'KReclaimable: 206636 kB' 'Slab: 531404 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324768 kB' 'KernelStack: 16128 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7573372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.581 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.582 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78831972 kB' 'MemAvailable: 82131492 kB' 'Buffers: 12176 kB' 'Cached: 9473684 kB' 'SwapCached: 0 kB' 'Active: 6550428 kB' 'Inactive: 3456260 kB' 'Active(anon): 6156844 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524008 kB' 'Mapped: 189120 kB' 'Shmem: 5636016 kB' 'KReclaimable: 206636 kB' 'Slab: 531404 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324768 kB' 'KernelStack: 16128 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7576028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.583 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.584 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- 
# mem_f=/proc/meminfo 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78828576 kB' 'MemAvailable: 82128096 kB' 'Buffers: 12176 kB' 'Cached: 9473704 kB' 'SwapCached: 0 kB' 'Active: 6552144 kB' 'Inactive: 3456260 kB' 'Active(anon): 6158560 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525760 kB' 'Mapped: 189452 kB' 'Shmem: 5636036 kB' 'KReclaimable: 206636 kB' 'Slab: 531532 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324896 kB' 'KernelStack: 16128 kB' 'PageTables: 8100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7577780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 
'DirectMap1G: 87031808 kB' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.585 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.586 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.587 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.588 nr_hugepages=1024 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.588 resv_hugepages=0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 
00:04:24.588 surplus_hugepages=0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.588 anon_hugepages=0 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78836396 kB' 'MemAvailable: 82135916 kB' 'Buffers: 12176 kB' 'Cached: 9473724 kB' 'SwapCached: 0 kB' 'Active: 6546488 kB' 'Inactive: 3456260 kB' 'Active(anon): 6152904 kB' 'Inactive(anon): 
0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520136 kB' 'Mapped: 188620 kB' 'Shmem: 5636056 kB' 'KReclaimable: 206636 kB' 'Slab: 531532 kB' 'SReclaimable: 206636 kB' 'SUnreclaim: 324896 kB' 'KernelStack: 16128 kB' 'PageTables: 8080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7571680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 863652 kB' 'DirectMap2M: 13492224 kB' 'DirectMap1G: 87031808 kB' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.588 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.589 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 
13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 
00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36504224 kB' 'MemUsed: 11612716 kB' 'SwapCached: 0 kB' 'Active: 5374724 kB' 'Inactive: 3372048 kB' 'Active(anon): 5216820 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8468880 kB' 'Mapped: 90300 kB' 'AnonPages: 281040 kB' 'Shmem: 4938928 kB' 'KernelStack: 8936 kB' 'PageTables: 4392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 126376 kB' 'Slab: 331220 kB' 'SReclaimable: 126376 kB' 'SUnreclaim: 204844 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.590 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.591 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:24.592 node0=1024 expecting 1024 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:24.592 00:04:24.592 real 0m7.144s 00:04:24.592 user 0m2.805s 00:04:24.592 sys 0m4.485s 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:24.592 13:23:03 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:24.592 ************************************ 00:04:24.592 END TEST no_shrink_alloc 00:04:24.592 ************************************ 00:04:24.592 13:23:03 
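The long `IFS=': ' / read -r var val _ / continue` runs traced above are `setup/common.sh`'s `get_meminfo` scanning a meminfo-style file field by field until it hits the requested key (`HugePages_Total`, then `HugePages_Surp` for node 0) and echoing its value. A minimal, self-contained sketch of that lookup pattern — the function name `get_meminfo_field` and the temp-file sample are illustrative, not the real SPDK helper:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace: split each line on
# ':' and whitespace, compare the key, print the value on a match.
get_meminfo_field() {
    local get=$1 file=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$file"
    return 1   # key not present
}

# Exercise it against a synthetic meminfo snippet (values copied from the log):
sample=$(mktemp)
printf '%s\n' 'MemTotal: 48116940 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' > "$sample"
get_meminfo_field HugePages_Total "$sample"   # prints 1024
```

The real helper additionally strips a leading `Node N ` prefix when reading `/sys/devices/system/node/nodeN/meminfo`, which is why the trace shows `mem=("${mem[@]#Node +([0-9]) }")` before the read loop.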
setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:24.592 13:23:03 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:24.592 00:04:24.592 real 0m28.390s 00:04:24.592 user 0m9.964s 00:04:24.592 sys 0m16.387s 00:04:24.592 13:23:03 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:24.592 13:23:03 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:24.592 ************************************ 00:04:24.592 END TEST hugepages 00:04:24.592 ************************************ 00:04:24.592 13:23:03 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:24.592 13:23:03 
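The `clear_hp` trace above (`hugepages.sh@37`–`@45`) writes `0` into every per-node hugepage count so the next test starts clean. The sketch below reproduces that nested loop against a throwaway directory tree shaped like the real sysfs layout (`/sys/devices/system/node/nodeN/hugepages/hugepages-*/nr_hugepages`), so it runs without root; on a live system the loop would target the real sysfs paths:

```shell
#!/usr/bin/env bash
# Build a fake two-node sysfs tree (real path: /sys/devices/system/node).
root=$(mktemp -d)
mkdir -p "$root/node0/hugepages/hugepages-2048kB" "$root/node1/hugepages/hugepages-2048kB"
echo 1024 > "$root/node0/hugepages/hugepages-2048kB/nr_hugepages"
echo 1024 > "$root/node1/hugepages/hugepages-2048kB/nr_hugepages"

# The clear_hp pattern: zero every hugepage pool on every node.
for node_dir in "$root"/node[0-9]*; do
    for hp in "$node_dir"/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"
    done
done
export CLEAR_HUGE=yes   # mirrors the flag the trace exports afterwards

cat "$root/node0/hugepages/hugepages-2048kB/nr_hugepages"   # prints 0
```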
setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:24.592 13:23:03 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:24.592 13:23:03 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:24.592 13:23:03 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:24.851 ************************************ 00:04:24.851 START TEST driver 00:04:24.851 ************************************ 00:04:24.851 13:23:04 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:24.851 * Looking for test storage... 00:04:24.851 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:24.851 13:23:04 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:24.851 13:23:04 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:24.851 13:23:04 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:30.120 13:23:09 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:30.120 13:23:09 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:30.120 13:23:09 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:30.120 13:23:09 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:30.120 ************************************ 00:04:30.120 START TEST guess_driver 00:04:30.120 ************************************ 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:30.120 
13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:30.120 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:30.120 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:30.120 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:30.120 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:30.120 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:30.121 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:30.121 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:30.121 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # 
return 0 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:30.121 Looking for driver=vfio-pci 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.121 13:23:09 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.409 13:23:12 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.409 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 
setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.667 13:23:13 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.667 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:33.926 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:33.926 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:33.926 13:23:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.459 13:23:15 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:41.735 00:04:41.735 real 0m11.263s 00:04:41.735 user 0m2.927s 00:04:41.735 sys 0m5.273s 00:04:41.735 13:23:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.735 13:23:20 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:41.735 ************************************ 00:04:41.735 END TEST guess_driver 00:04:41.735 ************************************ 00:04:41.735 13:23:20 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:41.735 00:04:41.735 real 0m16.370s 00:04:41.735 user 0m4.461s 00:04:41.735 sys 0m8.055s 00:04:41.735 13:23:20 setup.sh.driver 
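The `guess_driver` trace picks `vfio-pci` because `/sys/kernel/iommu_groups` is populated (`(( 216 > 0 ))`) and `modprobe --show-depends vfio_pci` resolves to real `.ko` files. A condensed sketch of just the group-count branch, run against fake directories so it works anywhere — the real `driver.sh` also checks the unsafe-noiommu parameter and the module dependency chain, which this omits:

```shell
#!/usr/bin/env bash
# Sketch: prefer vfio-pci when IOMMU groups exist, else fall back to
# uio_pci_generic. $1 is the groups dir (normally /sys/kernel/iommu_groups).
guess_driver() {
    local groups=("$1"/*)
    # With no matches and nullglob unset, the glob stays literal, so -e fails.
    if [[ -e ${groups[0]} ]]; then
        echo vfio-pci
    else
        echo uio_pci_generic
    fi
}

with_iommu=$(mktemp -d); mkdir "$with_iommu/0" "$with_iommu/1"
without_iommu=$(mktemp -d)
guess_driver "$with_iommu"      # prints vfio-pci
guess_driver "$without_iommu"   # prints uio_pci_generic
```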
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.735 13:23:20 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:41.735 ************************************ 00:04:41.735 END TEST driver 00:04:41.735 ************************************ 00:04:41.735 13:23:20 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:41.735 13:23:20 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:41.735 13:23:20 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:41.735 13:23:20 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:41.735 13:23:20 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:41.735 ************************************ 00:04:41.735 START TEST devices 00:04:41.735 ************************************ 00:04:41.735 13:23:20 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:41.735 * Looking for test storage... 
00:04:41.735 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:41.735 13:23:20 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:41.735 13:23:20 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:41.735 13:23:20 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:41.735 13:23:20 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:45.927 13:23:24 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:45.927 No valid GPT data, bailing 00:04:45.927 13:23:24 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:45.927 13:23:24 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:45.927 13:23:24 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:45.927 13:23:24 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:45.927 13:23:24 setup.sh.devices -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:04:45.927 13:23:24 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:45.927 ************************************ 00:04:45.927 START TEST nvme_mount 00:04:45.927 ************************************ 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:45.927 13:23:24 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:45.927 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:45.928 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:45.928 13:23:24 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:46.529 Creating new GPT entries in memory. 00:04:46.529 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:46.529 other utilities. 00:04:46.529 13:23:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:46.529 13:23:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:46.529 13:23:25 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:46.529 13:23:25 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:46.529 13:23:25 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:47.467 Creating new GPT entries in memory. 00:04:47.467 The operation has completed successfully. 
00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2010302 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:47.467 
13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.467 13:23:26 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:50.755 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:50.755 13:23:29 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:50.755 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:50.755 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:04:50.755 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:50.755 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:50.755 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:50.755 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:50.755 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.755 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:50.755 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.015 13:23:30 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:54.300 13:23:33 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:54.300 13:23:33 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.300 13:23:33 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.590 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:57.591 13:23:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:04:57.850 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:57.850 00:04:57.850 real 0m12.471s 00:04:57.850 user 0m3.605s 00:04:57.850 sys 0m6.741s 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:57.850 13:23:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:57.850 ************************************ 00:04:57.850 END TEST nvme_mount 00:04:57.850 ************************************ 00:04:57.850 13:23:37 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:04:57.850 13:23:37 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:57.850 13:23:37 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:57.850 13:23:37 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:57.850 13:23:37 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:57.850 ************************************ 00:04:57.850 START TEST dm_mount 00:04:57.850 ************************************ 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:57.850 13:23:37 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:57.850 13:23:37 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:58.787 Creating new GPT entries in memory. 00:04:58.787 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:58.787 other utilities. 00:04:58.787 13:23:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:58.787 13:23:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:58.787 13:23:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:04:58.787 13:23:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:58.787 13:23:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:00.168 Creating new GPT entries in memory. 00:05:00.168 The operation has completed successfully. 00:05:00.168 13:23:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:00.168 13:23:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:00.168 13:23:39 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:00.168 13:23:39 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:00.168 13:23:39 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:01.105 The operation has completed successfully. 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2014398 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
/dev/mapper/nvme_dm_test ]] 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:01.105 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.106 13:23:40 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:04.399 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.658 13:23:43 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
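The two `sgdisk --new=1:2048:2099199` and `--new=2:2099200:4196351` calls traced earlier follow from the `common.sh@57-60` arithmetic: a 1 GiB partition size is converted to 512-byte sectors, the first partition starts at sector 2048, and each later partition starts one sector past the previous end. A sketch reproducing those bounds:

```shell
# Sketch of the common.sh partition-bound arithmetic behind the sgdisk calls
# traced above. size is 1 GiB expressed in 512-byte sectors (2097152).
size=$(( 1073741824 / 512 ))
part_no=2
part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    echo "--new=${part}:${part_start}:${part_end}"
done
# -> --new=1:2048:2099199
# -> --new=2:2099200:4196351
```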
00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
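The `devices.sh@165-169` steps traced earlier resolve the dm name to its `dm-N` node via `readlink -f` on `/dev/mapper`, then require each backing partition to list that node under `/sys/class/block/<part>/holders/`. A sketch of that check against a mock sysfs tree (the paths under `$root` stand in for a live system):

```shell
# Sketch of the dm holder verification traced above, run against a throwaway
# mock of /dev and /sys/class/block rather than real devices.
root=$(mktemp -d)
mkdir -p "$root/dev/mapper" \
         "$root/sys/class/block/nvme0n1p1/holders/dm-0" \
         "$root/sys/class/block/nvme0n1p2/holders/dm-0"
touch "$root/dev/dm-0"
ln -s "$root/dev/dm-0" "$root/dev/mapper/nvme_dm_test"
dm=$(readlink -f "$root/dev/mapper/nvme_dm_test")  # resolves to .../dev/dm-0
dm=${dm##*/}                                       # keep just "dm-0"
ok=1
for part in nvme0n1p1 nvme0n1p2; do
    [[ -e "$root/sys/class/block/$part/holders/$dm" ]] || ok=0
done
echo "dm=$dm holders_ok=$ok"
rm -rf "$root"
```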
00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:07.946 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:08.205 13:23:47 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:08.205 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:08.205 00:05:08.205 real 0m10.411s 00:05:08.205 user 0m2.588s 00:05:08.205 sys 0m4.843s 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.205 13:23:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:08.205 ************************************ 00:05:08.205 END TEST dm_mount 00:05:08.205 ************************************ 00:05:08.205 13:23:47 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:08.205 13:23:47 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:08.205 13:23:47 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:08.205 13:23:47 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:08.465 13:23:47 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.465 13:23:47 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:05:08.465 13:23:47 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:08.465 13:23:47 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:08.759 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:08.759 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:08.759 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:08.759 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:08.759 13:23:47 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:08.759 00:05:08.759 real 0m27.429s 00:05:08.759 user 0m7.797s 00:05:08.759 sys 0m14.447s 00:05:08.759 13:23:47 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.759 13:23:47 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:08.759 ************************************ 00:05:08.759 END TEST devices 00:05:08.759 ************************************ 00:05:08.759 13:23:47 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:08.759 00:05:08.759 real 1m38.718s 00:05:08.759 user 0m30.274s 00:05:08.759 sys 0m54.268s 00:05:08.759 13:23:47 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.759 13:23:47 setup.sh -- common/autotest_common.sh@10 -- # set +x 
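The `wipefs --all` output above reports the on-disk signatures it erased: the 8-byte GPT signature `45 46 49 20 50 41 52 54` ("EFI PART") at byte offset 0x200 (LBA 1) and again in the backup header near the end of the disk, plus the protective MBR's `55 aa` marker at offset 0x1fe. A sketch that writes the primary signatures into a tiny scratch image and reads them back the way wipefs locates them (the backup GPT header is omitted for brevity):

```shell
# Sketch of the signature offsets wipefs reported above, using a 2 KiB scratch
# file instead of a real disk.
img=$(mktemp)
dd if=/dev/zero of="$img" bs=512 count=4 status=none
printf '\x55\xaa'  | dd of="$img" bs=1 seek=$((0x1fe)) conv=notrunc status=none
printf 'EFI PART'  | dd of="$img" bs=1 seek=512       conv=notrunc status=none
mbr_sig=$(od -An -tx1 -j $((0x1fe)) -N 2 "$img" | tr -d ' ')
gpt_sig=$(dd if="$img" bs=1 skip=512 count=8 status=none)
echo "PMBR: $mbr_sig  GPT: $gpt_sig"
rm -f "$img"
```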
00:05:08.759 ************************************ 00:05:08.759 END TEST setup.sh 00:05:08.759 ************************************ 00:05:08.759 13:23:47 -- common/autotest_common.sh@1142 -- # return 0 00:05:08.759 13:23:47 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:12.097 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:12.097 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:12.097 Hugepages 00:05:12.097 node hugesize free / total 00:05:12.097 node0 1048576kB 0 / 0 00:05:12.097 node0 2048kB 1024 / 1024 00:05:12.097 node1 1048576kB 0 / 0 00:05:12.097 node1 2048kB 1024 / 1024 00:05:12.097 00:05:12.097 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:12.097 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:12.097 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:12.356 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:12.356 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:12.356 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:12.356 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:12.356 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:12.356 13:23:51 -- spdk/autotest.sh@130 -- # uname -s 00:05:12.356 13:23:51 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
00:05:12.356 13:23:51 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:12.356 13:23:51 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:15.642 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:15.642 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:15.642 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:15.642 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:15.643 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:15.643 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:15.643 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:15.901 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.431 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:18.431 13:23:57 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:19.366 13:23:58 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:19.366 13:23:58 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:19.366 13:23:58 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:19.366 13:23:58 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:19.366 13:23:58 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:19.366 13:23:58 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:19.366 13:23:58 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:05:19.366 13:23:58 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:19.366 13:23:58 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:19.366 13:23:58 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:19.366 13:23:58 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:19.366 13:23:58 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:22.655 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:22.655 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:22.655 Waiting for block devices as requested 00:05:22.655 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:22.655 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:22.915 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:22.915 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:22.915 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.175 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.175 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:23.175 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:23.434 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:23.434 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:23.434 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:23.694 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:23.694 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.694 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.951 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:23.951 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:23.951 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:23.951 13:24:03 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:23.951 13:24:03 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:24.210 13:24:03 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:24.210 13:24:03 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:24.210 13:24:03 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:24.210 13:24:03 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:24.210 13:24:03 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:24.210 13:24:03 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:24.210 13:24:03 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:24.210 13:24:03 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:24.210 13:24:03 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:24.210 13:24:03 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:24.210 13:24:03 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:24.210 13:24:03 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:24.210 13:24:03 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:24.210 13:24:03 -- common/autotest_common.sh@1557 -- # continue 00:05:24.210 13:24:03 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:24.210 13:24:03 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:24.210 13:24:03 -- common/autotest_common.sh@10 -- # set +x 00:05:24.210 13:24:03 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
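The pre-cleanup step above decides whether the controller supports Namespace Management by piping `nvme id-ctrl` through grep/cut and testing bit 3 (0x8) of the OACS field; it then checks that `unvmcap` is zero. A minimal sketch of that parsing, using a hypothetical two-line sample in place of real `nvme id-ctrl /dev/nvme0` output:

```shell
# Hypothetical stand-in for `nvme id-ctrl /dev/nvme0` output.
id_ctrl_output='oacs      : 0x3f
unvmcap   : 0'

# Same grep/cut extraction as the harness uses.
oacs=$(printf '%s\n' "$id_ctrl_output" | grep oacs | cut -d: -f2)
# Bit 3 of OACS advertises Namespace Management support.
oacs_ns_manage=$(( oacs & 0x8 ))
unvmcap=$(printf '%s\n' "$id_ctrl_output" | grep unvmcap | cut -d: -f2)

echo "ns_manage=$oacs_ns_manage unvmcap=$(( unvmcap ))"
```

With the 0x3f value seen in the log this yields `ns_manage=8`, matching the `oacs_ns_manage=8` assignment above; a nonzero value lets the revert proceed, and `unvmcap == 0` triggers the `continue`.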
00:05:24.210 13:24:03 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:24.210 13:24:03 -- common/autotest_common.sh@10 -- # set +x 00:05:24.210 13:24:03 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:28.401 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:28.401 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:28.401 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:28.401 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.937 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:30.937 13:24:09 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:30.937 13:24:09 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:30.937 13:24:09 -- common/autotest_common.sh@10 -- # set +x 00:05:30.937 13:24:10 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:30.937 13:24:10 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:30.937 13:24:10 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:30.937 13:24:10 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:30.937 13:24:10 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:05:30.937 13:24:10 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:05:30.937 13:24:10 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:30.937 13:24:10 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:30.937 13:24:10 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:30.937 13:24:10 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:30.937 13:24:10 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:30.937 13:24:10 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:30.937 13:24:10 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:30.937 13:24:10 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:30.937 13:24:10 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:30.937 13:24:10 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:05:30.937 13:24:10 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:30.937 13:24:10 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:05:30.937 13:24:10 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:05:30.937 13:24:10 -- common/autotest_common.sh@1593 -- # return 0 00:05:30.937 13:24:10 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:30.937 13:24:10 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:30.937 13:24:10 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:30.937 13:24:10 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:30.937 13:24:10 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:31.544 Restarting all devices. 
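In the `opal_revert_cleanup` step above, `get_nvme_bdfs_by_id 0x0a54` reads each controller's PCI device ID from `/sys/bus/pci/devices/<bdf>/device` and keeps only matches; the lone 0x0b60 controller fails the comparison, so `bdfs` comes back empty and the function returns early. A sketch of that filter, with hypothetical BDF/ID pairs standing in for real sysfs reads:

```shell
# Hypothetical device list instead of reading
# /sys/bus/pci/devices/<bdf>/device on real hardware.
wanted=0x0a54
matches=
for pair in "0000:5e:00.0,0x0b60" "0000:5f:00.0,0x0a54"; do
  bdf=${pair%,*}      # BDF before the comma
  device=${pair#*,}   # device ID after the comma
  [ "$device" = "$wanted" ] && matches="$matches$bdf "
done
echo "matched: $matches"
```

Here only the hypothetical 0x0a54 device survives; in the run above nothing matches, which is why `[[ -z '' ]]` holds and the cleanup is skipped.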
00:05:35.734 lstat() error: No such file or directory 00:05:35.734 QAT Error: No GENERAL section found 00:05:35.734 Failed to configure qat_dev0 00:05:35.734 lstat() error: No such file or directory 00:05:35.734 QAT Error: No GENERAL section found 00:05:35.734 Failed to configure qat_dev1 00:05:35.734 lstat() error: No such file or directory 00:05:35.734 QAT Error: No GENERAL section found 00:05:35.734 Failed to configure qat_dev2 00:05:35.734 enable sriov 00:05:35.734 Checking status of all devices. 00:05:35.734 There is 3 QAT acceleration device(s) in the system: 00:05:35.734 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:35.734 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:35.734 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:36.668 0000:3d:00.0 set to 16 VFs 00:05:38.057 0000:3f:00.0 set to 16 VFs 00:05:39.432 0000:da:00.0 set to 16 VFs 00:05:42.736 Properly configured the qat device with driver uio_pci_generic. 
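The QAT status listing above packs several attributes into one line per device. A sketch of pulling the BSF address and state back out of such a line with sed; the field layout is assumed from the log output itself, not from QAT documentation:

```shell
# One status line as printed above (hypothetical copy).
line='qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down'

# Extract the comma-terminated bsf field and the trailing state field.
bsf=$(printf '%s\n' "$line" | sed -n 's/.*bsf: \([^,]*\),.*/\1/p')
state=$(printf '%s\n' "$line" | sed -n 's/.*state: \(.*\)$/\1/p')
echo "$bsf $state"
```

The three `bsf` addresses recovered this way are the same 0000:3d/3f/da devices that are then given 16 VFs each via SR-IOV.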
00:05:42.736 13:24:21 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:42.736 13:24:21 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:42.736 13:24:21 -- common/autotest_common.sh@10 -- # set +x 00:05:42.736 13:24:21 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:42.736 13:24:21 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:42.736 13:24:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:42.736 13:24:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.736 13:24:21 -- common/autotest_common.sh@10 -- # set +x 00:05:42.736 ************************************ 00:05:42.736 START TEST env 00:05:42.736 ************************************ 00:05:42.736 13:24:21 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:42.736 * Looking for test storage... 00:05:42.736 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:42.736 13:24:21 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:42.736 13:24:21 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:42.736 13:24:21 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.736 13:24:21 env -- common/autotest_common.sh@10 -- # set +x 00:05:42.736 ************************************ 00:05:42.736 START TEST env_memory 00:05:42.736 ************************************ 00:05:42.736 13:24:21 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:42.736 00:05:42.736 00:05:42.736 CUnit - A unit testing framework for C - Version 2.1-3 00:05:42.736 http://cunit.sourceforge.net/ 00:05:42.736 00:05:42.736 00:05:42.736 Suite: memory 00:05:42.736 Test: alloc and free memory map ...[2024-07-15 13:24:22.023033] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:42.736 passed 00:05:42.736 Test: mem map translation ...[2024-07-15 13:24:22.052363] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:42.736 [2024-07-15 13:24:22.052388] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:42.736 [2024-07-15 13:24:22.052444] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:42.736 [2024-07-15 13:24:22.052458] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:42.736 passed 00:05:42.736 Test: mem map registration ...[2024-07-15 13:24:22.110294] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:42.736 [2024-07-15 13:24:22.110317] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:42.736 passed 00:05:42.997 Test: mem map adjacent registrations ...passed 00:05:42.997 00:05:42.997 Run Summary: Type Total Ran Passed Failed Inactive 00:05:42.997 suites 1 1 n/a 0 0 00:05:42.997 tests 4 4 4 0 0 00:05:42.997 asserts 152 152 152 0 n/a 00:05:42.997 00:05:42.997 Elapsed time = 0.201 seconds 00:05:42.997 00:05:42.997 real 0m0.216s 00:05:42.997 user 0m0.205s 00:05:42.997 sys 0m0.010s 00:05:42.997 13:24:22 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
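The "invalid parameters" errors in the translation test above are expected failures: judging by the rejected calls, both `vaddr` and `len` must be aligned to the 2 MiB translation granularity, so `vaddr=2097152 len=1234` fails on length and `vaddr=1234 len=2097152` fails on address. A quick alignment check illustrating that rule (the 2 MiB granularity is inferred from the log, so treat it as an assumption):

```shell
# 2 MiB granularity assumed from the rejected parameter pairs above.
page_sz=$(( 2 * 1024 * 1024 ))
is_aligned() { [ $(( $1 % page_sz )) -eq 0 ] && echo yes || echo no; }

vaddr_ok=$(is_aligned 2097152)  # 2097152 = 2 MiB, aligned
len_ok=$(is_aligned 1234)       # 1234 is not a multiple of 2 MiB
echo "vaddr aligned: $vaddr_ok, len aligned: $len_ok"
```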
00:05:42.997 13:24:22 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:42.997 ************************************ 00:05:42.997 END TEST env_memory 00:05:42.997 ************************************ 00:05:42.997 13:24:22 env -- common/autotest_common.sh@1142 -- # return 0 00:05:42.997 13:24:22 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:42.997 13:24:22 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:42.997 13:24:22 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.997 13:24:22 env -- common/autotest_common.sh@10 -- # set +x 00:05:42.997 ************************************ 00:05:42.997 START TEST env_vtophys 00:05:42.997 ************************************ 00:05:42.997 13:24:22 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:42.997 EAL: lib.eal log level changed from notice to debug 00:05:42.998 EAL: Detected lcore 0 as core 0 on socket 0 00:05:42.998 EAL: Detected lcore 1 as core 1 on socket 0 00:05:42.998 EAL: Detected lcore 2 as core 2 on socket 0 00:05:42.998 EAL: Detected lcore 3 as core 3 on socket 0 00:05:42.998 EAL: Detected lcore 4 as core 4 on socket 0 00:05:42.998 EAL: Detected lcore 5 as core 8 on socket 0 00:05:42.998 EAL: Detected lcore 6 as core 9 on socket 0 00:05:42.998 EAL: Detected lcore 7 as core 10 on socket 0 00:05:42.998 EAL: Detected lcore 8 as core 11 on socket 0 00:05:42.998 EAL: Detected lcore 9 as core 16 on socket 0 00:05:42.998 EAL: Detected lcore 10 as core 17 on socket 0 00:05:42.998 EAL: Detected lcore 11 as core 18 on socket 0 00:05:42.998 EAL: Detected lcore 12 as core 19 on socket 0 00:05:42.998 EAL: Detected lcore 13 as core 20 on socket 0 00:05:42.998 EAL: Detected lcore 14 as core 24 on socket 0 00:05:42.998 EAL: Detected lcore 15 as core 25 on socket 0 00:05:42.998 EAL: Detected lcore 16 as core 26 on socket 0 
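The EAL topology dump that follows reports one "Detected lcore N as core C on socket S" line per logical core, 72 in all across 2 sockets. A small awk tally over lines of that shape, using three hypothetical lines in place of the full listing:

```shell
# Three hypothetical detection lines standing in for the full 72-lcore dump.
log='EAL: Detected lcore 0 as core 0 on socket 0
EAL: Detected lcore 18 as core 0 on socket 1
EAL: Detected lcore 36 as core 0 on socket 0'

# The socket ID is the last field of each line; count lcores per socket.
counts=$(printf '%s\n' "$log" \
  | awk '/Detected lcore/ { n[$NF]++ } END { for (s in n) print "socket " s ": " n[s] " lcores" }' \
  | sort)
echo "$counts"
```

Run against the real dump this would report 36 lcores on each socket, consistent with "Detected CPU lcores: 72" and "Detected NUMA nodes: 2".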
00:05:42.998 EAL: Detected lcore 17 as core 27 on socket 0 00:05:42.998 EAL: Detected lcore 18 as core 0 on socket 1 00:05:42.998 EAL: Detected lcore 19 as core 1 on socket 1 00:05:42.998 EAL: Detected lcore 20 as core 2 on socket 1 00:05:42.998 EAL: Detected lcore 21 as core 3 on socket 1 00:05:42.998 EAL: Detected lcore 22 as core 4 on socket 1 00:05:42.998 EAL: Detected lcore 23 as core 8 on socket 1 00:05:42.998 EAL: Detected lcore 24 as core 9 on socket 1 00:05:42.998 EAL: Detected lcore 25 as core 10 on socket 1 00:05:42.998 EAL: Detected lcore 26 as core 11 on socket 1 00:05:42.998 EAL: Detected lcore 27 as core 16 on socket 1 00:05:42.998 EAL: Detected lcore 28 as core 17 on socket 1 00:05:42.998 EAL: Detected lcore 29 as core 18 on socket 1 00:05:42.998 EAL: Detected lcore 30 as core 19 on socket 1 00:05:42.998 EAL: Detected lcore 31 as core 20 on socket 1 00:05:42.998 EAL: Detected lcore 32 as core 24 on socket 1 00:05:42.998 EAL: Detected lcore 33 as core 25 on socket 1 00:05:42.998 EAL: Detected lcore 34 as core 26 on socket 1 00:05:42.998 EAL: Detected lcore 35 as core 27 on socket 1 00:05:42.998 EAL: Detected lcore 36 as core 0 on socket 0 00:05:42.998 EAL: Detected lcore 37 as core 1 on socket 0 00:05:42.998 EAL: Detected lcore 38 as core 2 on socket 0 00:05:42.998 EAL: Detected lcore 39 as core 3 on socket 0 00:05:42.998 EAL: Detected lcore 40 as core 4 on socket 0 00:05:42.998 EAL: Detected lcore 41 as core 8 on socket 0 00:05:42.998 EAL: Detected lcore 42 as core 9 on socket 0 00:05:42.998 EAL: Detected lcore 43 as core 10 on socket 0 00:05:42.998 EAL: Detected lcore 44 as core 11 on socket 0 00:05:42.998 EAL: Detected lcore 45 as core 16 on socket 0 00:05:42.998 EAL: Detected lcore 46 as core 17 on socket 0 00:05:42.998 EAL: Detected lcore 47 as core 18 on socket 0 00:05:42.998 EAL: Detected lcore 48 as core 19 on socket 0 00:05:42.998 EAL: Detected lcore 49 as core 20 on socket 0 00:05:42.998 EAL: Detected lcore 50 as core 24 on socket 0 
00:05:42.998 EAL: Detected lcore 51 as core 25 on socket 0 00:05:42.998 EAL: Detected lcore 52 as core 26 on socket 0 00:05:42.998 EAL: Detected lcore 53 as core 27 on socket 0 00:05:42.998 EAL: Detected lcore 54 as core 0 on socket 1 00:05:42.998 EAL: Detected lcore 55 as core 1 on socket 1 00:05:42.998 EAL: Detected lcore 56 as core 2 on socket 1 00:05:42.998 EAL: Detected lcore 57 as core 3 on socket 1 00:05:42.998 EAL: Detected lcore 58 as core 4 on socket 1 00:05:42.998 EAL: Detected lcore 59 as core 8 on socket 1 00:05:42.998 EAL: Detected lcore 60 as core 9 on socket 1 00:05:42.998 EAL: Detected lcore 61 as core 10 on socket 1 00:05:42.998 EAL: Detected lcore 62 as core 11 on socket 1 00:05:42.998 EAL: Detected lcore 63 as core 16 on socket 1 00:05:42.998 EAL: Detected lcore 64 as core 17 on socket 1 00:05:42.998 EAL: Detected lcore 65 as core 18 on socket 1 00:05:42.998 EAL: Detected lcore 66 as core 19 on socket 1 00:05:42.998 EAL: Detected lcore 67 as core 20 on socket 1 00:05:42.998 EAL: Detected lcore 68 as core 24 on socket 1 00:05:42.998 EAL: Detected lcore 69 as core 25 on socket 1 00:05:42.998 EAL: Detected lcore 70 as core 26 on socket 1 00:05:42.998 EAL: Detected lcore 71 as core 27 on socket 1 00:05:42.998 EAL: Maximum logical cores by configuration: 128 00:05:42.998 EAL: Detected CPU lcores: 72 00:05:42.998 EAL: Detected NUMA nodes: 2 00:05:42.998 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:42.998 EAL: Detected shared linkage of DPDK 00:05:42.998 EAL: No shared files mode enabled, IPC will be disabled 00:05:42.998 EAL: No shared files mode enabled, IPC is disabled 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:05:42.998 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:05:42.998 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:05:42.998 EAL: Bus pci wants IOVA as 'PA' 00:05:42.998 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:42.998 EAL: Bus vdev wants IOVA as 'DC' 00:05:42.998 EAL: Selected IOVA mode 'PA' 00:05:42.998 EAL: Probing VFIO support... 00:05:42.998 EAL: IOMMU type 1 (Type 1) is supported 00:05:42.998 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:42.998 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:42.998 EAL: VFIO support initialized 00:05:42.998 EAL: Ask a virtual area of 0x2e000 bytes 00:05:42.998 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:42.998 EAL: Setting up physically contiguous memory... 
00:05:42.998 EAL: Setting maximum number of open files to 524288 00:05:42.998 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:42.998 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:42.998 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:42.998 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:42.998 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.998 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:42.998 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:42.998 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.998 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:42.998 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:42.999 EAL: Ask a virtual area of 0x61000 bytes 00:05:42.999 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:42.999 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:42.999 EAL: Ask a virtual area of 0x400000000 bytes 00:05:42.999 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:42.999 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:42.999 EAL: Hugepages will be freed exactly as allocated. 
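The memseg setup above reserves, per socket, 4 segment lists of 0x400000000 bytes of virtual address space each; hugepages are only mapped into that VA window later as allocations demand. The total reservation works out as follows (arithmetic only, taken from the sizes printed above):

```shell
# Sizes as printed in the EAL log above.
per_list=$(( 0x400000000 ))   # 16 GiB of VA per memseg list
lists_per_socket=4
sockets=2

total=$(( per_list * lists_per_socket * sockets ))
echo "$(( total / 1024 / 1024 / 1024 )) GiB of VA reserved"
```

That is 128 GiB of reserved address space, which is cheap: it is VA only, not committed memory.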
00:05:42.999 EAL: No shared files mode enabled, IPC is disabled 00:05:42.999 EAL: No shared files mode enabled, IPC is disabled 00:05:42.999 EAL: TSC frequency is ~2300000 KHz 00:05:42.999 EAL: Main lcore 0 is ready (tid=7f71be9c7b00;cpuset=[0]) 00:05:42.999 EAL: Trying to obtain current memory policy. 00:05:42.999 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:42.999 EAL: Restoring previous memory policy: 0 00:05:42.999 EAL: request: mp_malloc_sync 00:05:42.999 EAL: No shared files mode enabled, IPC is disabled 00:05:42.999 EAL: Heap on socket 0 was expanded by 2MB 00:05:42.999 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001000000 00:05:42.999 EAL: PCI memory mapped at 0x202001001000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001002000 00:05:42.999 EAL: PCI memory mapped at 0x202001003000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001004000 00:05:42.999 EAL: PCI memory mapped at 0x202001005000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001006000 00:05:42.999 EAL: PCI memory mapped at 0x202001007000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001008000 00:05:42.999 EAL: PCI memory mapped at 0x202001009000 00:05:42.999 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x20200100a000 00:05:42.999 EAL: PCI memory mapped at 0x20200100b000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x20200100c000 00:05:42.999 EAL: PCI memory mapped at 0x20200100d000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x20200100e000 00:05:42.999 EAL: PCI memory mapped at 0x20200100f000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001010000 00:05:42.999 EAL: PCI memory mapped at 0x202001011000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001012000 00:05:42.999 EAL: PCI memory mapped at 0x202001013000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 0x202001014000 00:05:42.999 EAL: PCI memory mapped at 0x202001015000 00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:42.999 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:05:42.999 EAL: probe driver: 8086:37c9 qat 00:05:42.999 EAL: PCI memory mapped at 
0x202001016000
00:05:42.999 EAL: PCI memory mapped at 0x202001017000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:05:42.999 EAL: PCI device 0000:3d:02.4 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001018000
00:05:42.999 EAL: PCI memory mapped at 0x202001019000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:05:42.999 EAL: PCI device 0000:3d:02.5 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200101a000
00:05:42.999 EAL: PCI memory mapped at 0x20200101b000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:05:42.999 EAL: PCI device 0000:3d:02.6 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200101c000
00:05:42.999 EAL: PCI memory mapped at 0x20200101d000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:05:42.999 EAL: PCI device 0000:3d:02.7 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200101e000
00:05:42.999 EAL: PCI memory mapped at 0x20200101f000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.0 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001020000
00:05:42.999 EAL: PCI memory mapped at 0x202001021000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.1 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001022000
00:05:42.999 EAL: PCI memory mapped at 0x202001023000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.2 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001024000
00:05:42.999 EAL: PCI memory mapped at 0x202001025000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.3 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001026000
00:05:42.999 EAL: PCI memory mapped at 0x202001027000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.4 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001028000
00:05:42.999 EAL: PCI memory mapped at 0x202001029000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.5 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200102a000
00:05:42.999 EAL: PCI memory mapped at 0x20200102b000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.6 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200102c000
00:05:42.999 EAL: PCI memory mapped at 0x20200102d000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:01.7 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200102e000
00:05:42.999 EAL: PCI memory mapped at 0x20200102f000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.0 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001030000
00:05:42.999 EAL: PCI memory mapped at 0x202001031000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.1 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001032000
00:05:42.999 EAL: PCI memory mapped at 0x202001033000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.2 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001034000
00:05:42.999 EAL: PCI memory mapped at 0x202001035000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.3 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001036000
00:05:42.999 EAL: PCI memory mapped at 0x202001037000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.4 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001038000
00:05:42.999 EAL: PCI memory mapped at 0x202001039000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.5 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200103a000
00:05:42.999 EAL: PCI memory mapped at 0x20200103b000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.6 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200103c000
00:05:42.999 EAL: PCI memory mapped at 0x20200103d000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:05:42.999 EAL: PCI device 0000:3f:02.7 on NUMA socket 0
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x20200103e000
00:05:42.999 EAL: PCI memory mapped at 0x20200103f000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:05:42.999 EAL: PCI device 0000:da:01.0 on NUMA socket 1
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001040000
00:05:42.999 EAL: PCI memory mapped at 0x202001041000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:05:42.999 EAL: Trying to obtain current memory policy.
00:05:42.999 EAL: Setting policy MPOL_PREFERRED for socket 1
00:05:42.999 EAL: Restoring previous memory policy: 4
00:05:42.999 EAL: request: mp_malloc_sync
00:05:42.999 EAL: No shared files mode enabled, IPC is disabled
00:05:42.999 EAL: Heap on socket 1 was expanded by 2MB
00:05:42.999 EAL: PCI device 0000:da:01.1 on NUMA socket 1
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001042000
00:05:42.999 EAL: PCI memory mapped at 0x202001043000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:05:42.999 EAL: PCI device 0000:da:01.2 on NUMA socket 1
00:05:42.999 EAL: probe driver: 8086:37c9 qat
00:05:42.999 EAL: PCI memory mapped at 0x202001044000
00:05:42.999 EAL: PCI memory mapped at 0x202001045000
00:05:42.999 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:05:42.999 EAL: PCI device 0000:da:01.3 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001046000
00:05:43.000 EAL: PCI memory mapped at 0x202001047000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:01.4 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001048000
00:05:43.000 EAL: PCI memory mapped at 0x202001049000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:01.5 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200104a000
00:05:43.000 EAL: PCI memory mapped at 0x20200104b000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:01.6 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200104c000
00:05:43.000 EAL: PCI memory mapped at 0x20200104d000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:01.7 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200104e000
00:05:43.000 EAL: PCI memory mapped at 0x20200104f000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.0 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001050000
00:05:43.000 EAL: PCI memory mapped at 0x202001051000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.1 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001052000
00:05:43.000 EAL: PCI memory mapped at 0x202001053000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.2 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001054000
00:05:43.000 EAL: PCI memory mapped at 0x202001055000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001056000
00:05:43.000 EAL: PCI memory mapped at 0x202001057000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x202001058000
00:05:43.000 EAL: PCI memory mapped at 0x202001059000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200105a000
00:05:43.000 EAL: PCI memory mapped at 0x20200105b000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200105c000
00:05:43.000 EAL: PCI memory mapped at 0x20200105d000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:05:43.000 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:05:43.000 EAL: probe driver: 8086:37c9 qat
00:05:43.000 EAL: PCI memory mapped at 0x20200105e000
00:05:43.000 EAL: PCI memory mapped at 0x20200105f000
00:05:43.000 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:05:43.000 EAL: No shared files mode enabled, IPC is disabled
00:05:43.000 EAL: No shared files mode enabled, IPC is disabled
00:05:43.000 EAL: No PCI address specified using 'addr=' in: bus=pci
00:05:43.259 EAL: Mem event callback 'spdk:(nil)' registered
00:05:43.259
00:05:43.259
00:05:43.259 CUnit - A unit testing framework for C - Version 2.1-3
00:05:43.259 http://cunit.sourceforge.net/
00:05:43.259
00:05:43.259
00:05:43.259 Suite: components_suite
00:05:43.259 Test: vtophys_malloc_test ...passed
00:05:43.259 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 4MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 4MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 6MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 6MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 10MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 10MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 18MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 18MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 34MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 34MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 66MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 66MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 130MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was shrunk by 130MB
00:05:43.259 EAL: Trying to obtain current memory policy.
00:05:43.259 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.259 EAL: Restoring previous memory policy: 4
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.259 EAL: request: mp_malloc_sync
00:05:43.259 EAL: No shared files mode enabled, IPC is disabled
00:05:43.259 EAL: Heap on socket 0 was expanded by 258MB
00:05:43.259 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.519 EAL: request: mp_malloc_sync
00:05:43.519 EAL: No shared files mode enabled, IPC is disabled
00:05:43.519 EAL: Heap on socket 0 was shrunk by 258MB
00:05:43.519 EAL: Trying to obtain current memory policy.
00:05:43.519 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:43.519 EAL: Restoring previous memory policy: 4
00:05:43.519 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.519 EAL: request: mp_malloc_sync
00:05:43.519 EAL: No shared files mode enabled, IPC is disabled
00:05:43.519 EAL: Heap on socket 0 was expanded by 514MB
00:05:43.777 EAL: Calling mem event callback 'spdk:(nil)'
00:05:43.777 EAL: request: mp_malloc_sync
00:05:43.777 EAL: No shared files mode enabled, IPC is disabled
00:05:43.777 EAL: Heap on socket 0 was shrunk by 514MB
00:05:43.778 EAL: Trying to obtain current memory policy.
00:05:43.778 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:44.034 EAL: Restoring previous memory policy: 4
00:05:44.034 EAL: Calling mem event callback 'spdk:(nil)'
00:05:44.034 EAL: request: mp_malloc_sync
00:05:44.034 EAL: No shared files mode enabled, IPC is disabled
00:05:44.034 EAL: Heap on socket 0 was expanded by 1026MB
00:05:44.292 EAL: Calling mem event callback 'spdk:(nil)'
00:05:44.292 EAL: request: mp_malloc_sync
00:05:44.292 EAL: No shared files mode enabled, IPC is disabled
00:05:44.292 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:44.292 passed
00:05:44.292
00:05:44.292 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:44.292               suites      1      1    n/a      0        0
00:05:44.292                tests      2      2      2      0        0
00:05:44.292              asserts   5568   5568   5568      0      n/a
00:05:44.292
00:05:44.292 Elapsed time = 1.205 seconds
00:05:44.292 EAL: No shared files mode enabled, IPC is disabled
00:05:44.292 EAL: No shared files mode enabled, IPC is disabled
00:05:44.292 EAL: No shared files mode enabled, IPC is disabled
00:05:44.292
00:05:44.292 real    0m1.406s
00:05:44.292 user    0m0.789s
00:05:44.292 sys     0m0.586s
00:05:44.292 13:24:23 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:44.292 13:24:23 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:05:44.292 ************************************
00:05:44.292 END TEST env_vtophys
00:05:44.292 ************************************
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1142 -- # return 0
00:05:44.550 13:24:23 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:44.550 13:24:23 env -- common/autotest_common.sh@10 -- # set +x
00:05:44.550 ************************************
00:05:44.550 START TEST env_pci
00:05:44.550 ************************************
00:05:44.550 13:24:23 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:44.550
00:05:44.550
00:05:44.550 CUnit - A unit testing framework for C - Version 2.1-3
00:05:44.550 http://cunit.sourceforge.net/
00:05:44.550
00:05:44.550
00:05:44.550 Suite: pci
00:05:44.550 Test: pci_hook ...[2024-07-15 13:24:23.774529] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2026026 has claimed it
00:05:44.550 EAL: Cannot find device (10000:00:01.0)
00:05:44.550 EAL: Failed to attach device on primary process
00:05:44.550 passed
00:05:44.550
00:05:44.550 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:44.550               suites      1      1    n/a      0        0
00:05:44.550                tests      1      1      1      0        0
00:05:44.550              asserts     25     25     25      0      n/a
00:05:44.550
00:05:44.550 Elapsed time = 0.042 seconds
00:05:44.550
00:05:44.550 real    0m0.069s
00:05:44.550 user    0m0.018s
00:05:44.550 sys     0m0.050s
00:05:44.550 13:24:23 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:44.550 13:24:23 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:05:44.550 ************************************
00:05:44.550 END TEST env_pci
00:05:44.550 ************************************
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1142 -- # return 0
00:05:44.550 13:24:23 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:44.550 13:24:23 env -- env/env.sh@15 -- # uname
00:05:44.550 13:24:23 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:44.550 13:24:23 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:44.550 13:24:23 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:05:44.550 13:24:23 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:44.550 13:24:23 env -- common/autotest_common.sh@10 -- # set +x
00:05:44.550 ************************************
00:05:44.550 START TEST env_dpdk_post_init
00:05:44.550 ************************************
00:05:44.550 13:24:23 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:44.550 EAL: Detected CPU lcores: 72
00:05:44.550 EAL: Detected NUMA nodes: 2
00:05:44.550 EAL: Detected shared linkage of DPDK
00:05:44.550 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:44.810 EAL: Selected IOVA mode 'PA'
00:05:44.810 EAL: VFIO support initialized
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:05:44.810 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym
00:05:44.810 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym
00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym
00:05:44.811
CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:44.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:44.811 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:44.811 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:44.811 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:44.811 EAL: Using IOMMU type 1 (Type 1) 00:05:44.811 EAL: Ignore 
mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:44.811 EAL: Ignore mapping IO port bar(1) 00:05:44.811 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:45.071 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:05:45.071 EAL: Ignore mapping IO port bar(1) 00:05:45.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:45.071 EAL: Ignore mapping IO port bar(1) 00:05:45.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:45.071 EAL: Ignore mapping IO port bar(1) 00:05:45.071 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Ignore mapping IO port bar(5) 00:05:45.331 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:05:45.331 EAL: Ignore mapping IO port bar(1) 00:05:45.331 EAL: Ignore mapping IO port bar(5) 00:05:45.331 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:05:48.614 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:48.614 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:48.614 Starting DPDK initialization... 00:05:48.614 Starting SPDK post initialization... 00:05:48.614 SPDK NVMe probe 00:05:48.614 Attaching to 0000:5e:00.0 00:05:48.614 Attached to 0000:5e:00.0 00:05:48.614 Cleaning up... 
00:05:48.614 00:05:48.614 real 0m3.511s 00:05:48.614 user 0m2.385s 00:05:48.614 sys 0m0.679s 00:05:48.614 13:24:27 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.614 13:24:27 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:48.614 ************************************ 00:05:48.614 END TEST env_dpdk_post_init 00:05:48.614 ************************************ 00:05:48.614 13:24:27 env -- common/autotest_common.sh@1142 -- # return 0 00:05:48.614 13:24:27 env -- env/env.sh@26 -- # uname 00:05:48.614 13:24:27 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:48.614 13:24:27 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:48.614 13:24:27 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.614 13:24:27 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.614 13:24:27 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.614 ************************************ 00:05:48.614 START TEST env_mem_callbacks 00:05:48.614 ************************************ 00:05:48.614 13:24:27 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:48.614 EAL: Detected CPU lcores: 72 00:05:48.614 EAL: Detected NUMA nodes: 2 00:05:48.614 EAL: Detected shared linkage of DPDK 00:05:48.614 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:48.614 EAL: Selected IOVA mode 'PA' 00:05:48.614 EAL: VFIO support initialized 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:48.614 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.614 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:48.614 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.614 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:48.615 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:48.615 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.615 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:48.615 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.615 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:48.616 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:48.616 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:48.616 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:48.616 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:48.616 00:05:48.616 00:05:48.616 CUnit - A unit testing framework for C - Version 2.1-3 00:05:48.616 http://cunit.sourceforge.net/ 00:05:48.616 00:05:48.616 00:05:48.616 Suite: memory 00:05:48.616 Test: test ... 00:05:48.616 register 0x200000200000 2097152 00:05:48.616 register 0x201000a00000 2097152 00:05:48.616 malloc 3145728 00:05:48.616 register 0x200000400000 4194304 00:05:48.616 buf 0x200000500000 len 3145728 PASSED 00:05:48.616 malloc 64 00:05:48.616 buf 0x2000004fff40 len 64 PASSED 00:05:48.616 malloc 4194304 00:05:48.616 register 0x200000800000 6291456 00:05:48.616 buf 0x200000a00000 len 4194304 PASSED 00:05:48.616 free 0x200000500000 3145728 00:05:48.616 free 0x2000004fff40 64 00:05:48.616 unregister 0x200000400000 4194304 PASSED 00:05:48.616 free 0x200000a00000 4194304 00:05:48.616 unregister 0x200000800000 6291456 PASSED 00:05:48.616 malloc 8388608 00:05:48.616 register 0x200000400000 10485760 00:05:48.616 buf 0x200000600000 len 8388608 PASSED 00:05:48.616 free 0x200000600000 8388608 00:05:48.616 unregister 0x200000400000 10485760 PASSED 00:05:48.616 passed 00:05:48.616 00:05:48.616 Run Summary: Type Total Ran Passed Failed Inactive 00:05:48.616 suites 1 1 n/a 0 0 00:05:48.616 tests 1 1 1 0 0 
00:05:48.616 asserts 16 16 16 0 n/a 00:05:48.616 00:05:48.616 Elapsed time = 0.006 seconds 00:05:48.616 00:05:48.616 real 0m0.113s 00:05:48.616 user 0m0.031s 00:05:48.616 sys 0m0.081s 00:05:48.616 13:24:27 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.616 13:24:27 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:48.616 ************************************ 00:05:48.616 END TEST env_mem_callbacks 00:05:48.616 ************************************ 00:05:48.616 13:24:27 env -- common/autotest_common.sh@1142 -- # return 0 00:05:48.616 00:05:48.616 real 0m5.829s 00:05:48.616 user 0m3.602s 00:05:48.616 sys 0m1.787s 00:05:48.616 13:24:27 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:48.616 13:24:27 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.616 ************************************ 00:05:48.616 END TEST env 00:05:48.616 ************************************ 00:05:48.616 13:24:27 -- common/autotest_common.sh@1142 -- # return 0 00:05:48.616 13:24:27 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:48.616 13:24:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:48.616 13:24:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.616 13:24:27 -- common/autotest_common.sh@10 -- # set +x 00:05:48.616 ************************************ 00:05:48.616 START TEST rpc 00:05:48.616 ************************************ 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:48.616 * Looking for test storage... 
00:05:48.616 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:48.616 13:24:27 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2026677 00:05:48.616 13:24:27 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:48.616 13:24:27 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:48.616 13:24:27 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2026677 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@829 -- # '[' -z 2026677 ']' 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.616 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:48.616 13:24:27 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.616 [2024-07-15 13:24:27.898410] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:05:48.616 [2024-07-15 13:24:27.898481] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2026677 ] 00:05:48.616 [2024-07-15 13:24:28.029742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.874 [2024-07-15 13:24:28.129776] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:48.874 [2024-07-15 13:24:28.129829] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2026677' to capture a snapshot of events at runtime. 
00:05:48.874 [2024-07-15 13:24:28.129844] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:48.874 [2024-07-15 13:24:28.129856] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:48.874 [2024-07-15 13:24:28.129867] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2026677 for offline analysis/debug. 00:05:48.874 [2024-07-15 13:24:28.129897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.439 13:24:28 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:49.439 13:24:28 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:49.439 13:24:28 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:49.439 13:24:28 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:49.439 13:24:28 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:49.439 13:24:28 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:49.439 13:24:28 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.439 13:24:28 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.439 13:24:28 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:49.439 ************************************ 00:05:49.439 START TEST rpc_integrity 00:05:49.439 ************************************ 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:49.439 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.439 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.697 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.697 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:49.697 { 00:05:49.697 "name": "Malloc0", 00:05:49.697 "aliases": [ 00:05:49.697 "dc9e55d4-d7ae-4f77-9b01-aeeba1865096" 00:05:49.697 ], 00:05:49.697 "product_name": "Malloc disk", 00:05:49.697 "block_size": 512, 00:05:49.697 "num_blocks": 16384, 00:05:49.697 "uuid": "dc9e55d4-d7ae-4f77-9b01-aeeba1865096", 00:05:49.697 "assigned_rate_limits": { 00:05:49.697 "rw_ios_per_sec": 0, 00:05:49.697 "rw_mbytes_per_sec": 0, 00:05:49.697 "r_mbytes_per_sec": 0, 00:05:49.697 "w_mbytes_per_sec": 0 00:05:49.697 }, 00:05:49.697 "claimed": false, 00:05:49.697 
"zoned": false, 00:05:49.697 "supported_io_types": { 00:05:49.697 "read": true, 00:05:49.697 "write": true, 00:05:49.697 "unmap": true, 00:05:49.697 "flush": true, 00:05:49.697 "reset": true, 00:05:49.697 "nvme_admin": false, 00:05:49.697 "nvme_io": false, 00:05:49.697 "nvme_io_md": false, 00:05:49.697 "write_zeroes": true, 00:05:49.697 "zcopy": true, 00:05:49.697 "get_zone_info": false, 00:05:49.697 "zone_management": false, 00:05:49.697 "zone_append": false, 00:05:49.697 "compare": false, 00:05:49.697 "compare_and_write": false, 00:05:49.697 "abort": true, 00:05:49.697 "seek_hole": false, 00:05:49.697 "seek_data": false, 00:05:49.697 "copy": true, 00:05:49.697 "nvme_iov_md": false 00:05:49.697 }, 00:05:49.697 "memory_domains": [ 00:05:49.697 { 00:05:49.697 "dma_device_id": "system", 00:05:49.697 "dma_device_type": 1 00:05:49.697 }, 00:05:49.697 { 00:05:49.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:49.697 "dma_device_type": 2 00:05:49.697 } 00:05:49.697 ], 00:05:49.697 "driver_specific": {} 00:05:49.697 } 00:05:49.697 ]' 00:05:49.697 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:49.697 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:49.697 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:49.697 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.697 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.697 [2024-07-15 13:24:28.928642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:49.698 [2024-07-15 13:24:28.928688] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:49.698 [2024-07-15 13:24:28.928707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b06eb0 00:05:49.698 [2024-07-15 13:24:28.928720] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:49.698 [2024-07-15 
13:24:28.930296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:49.698 [2024-07-15 13:24:28.930326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:49.698 Passthru0 00:05:49.698 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.698 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:49.698 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.698 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.698 13:24:28 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.698 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:49.698 { 00:05:49.698 "name": "Malloc0", 00:05:49.698 "aliases": [ 00:05:49.698 "dc9e55d4-d7ae-4f77-9b01-aeeba1865096" 00:05:49.698 ], 00:05:49.698 "product_name": "Malloc disk", 00:05:49.698 "block_size": 512, 00:05:49.698 "num_blocks": 16384, 00:05:49.698 "uuid": "dc9e55d4-d7ae-4f77-9b01-aeeba1865096", 00:05:49.698 "assigned_rate_limits": { 00:05:49.698 "rw_ios_per_sec": 0, 00:05:49.698 "rw_mbytes_per_sec": 0, 00:05:49.698 "r_mbytes_per_sec": 0, 00:05:49.698 "w_mbytes_per_sec": 0 00:05:49.698 }, 00:05:49.698 "claimed": true, 00:05:49.698 "claim_type": "exclusive_write", 00:05:49.698 "zoned": false, 00:05:49.698 "supported_io_types": { 00:05:49.698 "read": true, 00:05:49.698 "write": true, 00:05:49.698 "unmap": true, 00:05:49.698 "flush": true, 00:05:49.698 "reset": true, 00:05:49.698 "nvme_admin": false, 00:05:49.698 "nvme_io": false, 00:05:49.698 "nvme_io_md": false, 00:05:49.698 "write_zeroes": true, 00:05:49.698 "zcopy": true, 00:05:49.698 "get_zone_info": false, 00:05:49.698 "zone_management": false, 00:05:49.698 "zone_append": false, 00:05:49.698 "compare": false, 00:05:49.698 "compare_and_write": false, 00:05:49.698 "abort": true, 00:05:49.698 "seek_hole": false, 00:05:49.698 "seek_data": false, 
00:05:49.698 "copy": true, 00:05:49.698 "nvme_iov_md": false 00:05:49.698 }, 00:05:49.698 "memory_domains": [ 00:05:49.698 { 00:05:49.698 "dma_device_id": "system", 00:05:49.698 "dma_device_type": 1 00:05:49.698 }, 00:05:49.698 { 00:05:49.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:49.698 "dma_device_type": 2 00:05:49.698 } 00:05:49.698 ], 00:05:49.698 "driver_specific": {} 00:05:49.698 }, 00:05:49.698 { 00:05:49.698 "name": "Passthru0", 00:05:49.698 "aliases": [ 00:05:49.698 "fbc1f9a4-121b-564c-b1f0-29c539a17856" 00:05:49.698 ], 00:05:49.698 "product_name": "passthru", 00:05:49.698 "block_size": 512, 00:05:49.698 "num_blocks": 16384, 00:05:49.698 "uuid": "fbc1f9a4-121b-564c-b1f0-29c539a17856", 00:05:49.698 "assigned_rate_limits": { 00:05:49.698 "rw_ios_per_sec": 0, 00:05:49.698 "rw_mbytes_per_sec": 0, 00:05:49.698 "r_mbytes_per_sec": 0, 00:05:49.698 "w_mbytes_per_sec": 0 00:05:49.698 }, 00:05:49.698 "claimed": false, 00:05:49.698 "zoned": false, 00:05:49.698 "supported_io_types": { 00:05:49.698 "read": true, 00:05:49.698 "write": true, 00:05:49.698 "unmap": true, 00:05:49.698 "flush": true, 00:05:49.698 "reset": true, 00:05:49.698 "nvme_admin": false, 00:05:49.698 "nvme_io": false, 00:05:49.698 "nvme_io_md": false, 00:05:49.698 "write_zeroes": true, 00:05:49.698 "zcopy": true, 00:05:49.698 "get_zone_info": false, 00:05:49.698 "zone_management": false, 00:05:49.698 "zone_append": false, 00:05:49.698 "compare": false, 00:05:49.698 "compare_and_write": false, 00:05:49.698 "abort": true, 00:05:49.698 "seek_hole": false, 00:05:49.698 "seek_data": false, 00:05:49.698 "copy": true, 00:05:49.698 "nvme_iov_md": false 00:05:49.698 }, 00:05:49.698 "memory_domains": [ 00:05:49.698 { 00:05:49.698 "dma_device_id": "system", 00:05:49.698 "dma_device_type": 1 00:05:49.698 }, 00:05:49.698 { 00:05:49.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:49.698 "dma_device_type": 2 00:05:49.698 } 00:05:49.698 ], 00:05:49.698 "driver_specific": { 00:05:49.698 "passthru": { 
00:05:49.698 "name": "Passthru0", 00:05:49.698 "base_bdev_name": "Malloc0" 00:05:49.698 } 00:05:49.698 } 00:05:49.698 } 00:05:49.698 ]' 00:05:49.698 13:24:28 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:49.698 13:24:29 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:49.698 00:05:49.698 real 0m0.301s 00:05:49.698 user 0m0.183s 00:05:49.698 sys 0m0.057s 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:49.698 13:24:29 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:49.698 ************************************ 00:05:49.698 END TEST rpc_integrity 00:05:49.698 
************************************ 00:05:49.956 13:24:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:49.956 13:24:29 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:49.956 13:24:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.956 13:24:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.956 13:24:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:49.956 ************************************ 00:05:49.956 START TEST rpc_plugins 00:05:49.956 ************************************ 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:49.956 { 00:05:49.956 "name": "Malloc1", 00:05:49.956 "aliases": [ 00:05:49.956 "000aa9a7-0682-4213-9c51-b4a7662553e8" 00:05:49.956 ], 00:05:49.956 "product_name": "Malloc disk", 00:05:49.956 "block_size": 4096, 00:05:49.956 "num_blocks": 256, 00:05:49.956 "uuid": "000aa9a7-0682-4213-9c51-b4a7662553e8", 00:05:49.956 "assigned_rate_limits": { 00:05:49.956 "rw_ios_per_sec": 0, 00:05:49.956 "rw_mbytes_per_sec": 0, 00:05:49.956 "r_mbytes_per_sec": 0, 00:05:49.956 "w_mbytes_per_sec": 0 
00:05:49.956 }, 00:05:49.956 "claimed": false, 00:05:49.956 "zoned": false, 00:05:49.956 "supported_io_types": { 00:05:49.956 "read": true, 00:05:49.956 "write": true, 00:05:49.956 "unmap": true, 00:05:49.956 "flush": true, 00:05:49.956 "reset": true, 00:05:49.956 "nvme_admin": false, 00:05:49.956 "nvme_io": false, 00:05:49.956 "nvme_io_md": false, 00:05:49.956 "write_zeroes": true, 00:05:49.956 "zcopy": true, 00:05:49.956 "get_zone_info": false, 00:05:49.956 "zone_management": false, 00:05:49.956 "zone_append": false, 00:05:49.956 "compare": false, 00:05:49.956 "compare_and_write": false, 00:05:49.956 "abort": true, 00:05:49.956 "seek_hole": false, 00:05:49.956 "seek_data": false, 00:05:49.956 "copy": true, 00:05:49.956 "nvme_iov_md": false 00:05:49.956 }, 00:05:49.956 "memory_domains": [ 00:05:49.956 { 00:05:49.956 "dma_device_id": "system", 00:05:49.956 "dma_device_type": 1 00:05:49.956 }, 00:05:49.956 { 00:05:49.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:49.956 "dma_device_type": 2 00:05:49.956 } 00:05:49.956 ], 00:05:49.956 "driver_specific": {} 00:05:49.956 } 00:05:49.956 ]' 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.956 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:49.956 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:49.956 13:24:29 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:49.957 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:49.957 13:24:29 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:49.957 00:05:49.957 real 0m0.149s 00:05:49.957 user 0m0.089s 00:05:49.957 sys 0m0.028s 00:05:49.957 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:49.957 13:24:29 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:49.957 ************************************ 00:05:49.957 END TEST rpc_plugins 00:05:49.957 ************************************ 00:05:49.957 13:24:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:49.957 13:24:29 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:49.957 13:24:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.957 13:24:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.957 13:24:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.215 ************************************ 00:05:50.215 START TEST rpc_trace_cmd_test 00:05:50.215 ************************************ 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.215 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:50.215 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2026677", 00:05:50.215 "tpoint_group_mask": "0x8", 00:05:50.215 "iscsi_conn": { 00:05:50.215 "mask": "0x2", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 
"scsi": { 00:05:50.215 "mask": "0x4", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 "bdev": { 00:05:50.215 "mask": "0x8", 00:05:50.215 "tpoint_mask": "0xffffffffffffffff" 00:05:50.215 }, 00:05:50.215 "nvmf_rdma": { 00:05:50.215 "mask": "0x10", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 "nvmf_tcp": { 00:05:50.215 "mask": "0x20", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 "ftl": { 00:05:50.215 "mask": "0x40", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 "blobfs": { 00:05:50.215 "mask": "0x80", 00:05:50.215 "tpoint_mask": "0x0" 00:05:50.215 }, 00:05:50.215 "dsa": { 00:05:50.216 "mask": "0x200", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "thread": { 00:05:50.216 "mask": "0x400", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "nvme_pcie": { 00:05:50.216 "mask": "0x800", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "iaa": { 00:05:50.216 "mask": "0x1000", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "nvme_tcp": { 00:05:50.216 "mask": "0x2000", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "bdev_nvme": { 00:05:50.216 "mask": "0x4000", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 }, 00:05:50.216 "sock": { 00:05:50.216 "mask": "0x8000", 00:05:50.216 "tpoint_mask": "0x0" 00:05:50.216 } 00:05:50.216 }' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:50.216 
13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:50.216 00:05:50.216 real 0m0.247s 00:05:50.216 user 0m0.205s 00:05:50.216 sys 0m0.035s 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.216 13:24:29 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:50.216 ************************************ 00:05:50.216 END TEST rpc_trace_cmd_test 00:05:50.216 ************************************ 00:05:50.475 13:24:29 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:50.475 13:24:29 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:50.475 13:24:29 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:50.475 13:24:29 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:50.475 13:24:29 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:50.475 13:24:29 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:50.475 13:24:29 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 ************************************ 00:05:50.475 START TEST rpc_daemon_integrity 00:05:50.475 ************************************ 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:50.475 { 00:05:50.475 "name": "Malloc2", 00:05:50.475 "aliases": [ 00:05:50.475 "6e860bd6-aaa0-486b-ba6d-ec0340331917" 00:05:50.475 ], 00:05:50.475 "product_name": "Malloc disk", 00:05:50.475 "block_size": 512, 00:05:50.475 "num_blocks": 16384, 00:05:50.475 "uuid": "6e860bd6-aaa0-486b-ba6d-ec0340331917", 00:05:50.475 "assigned_rate_limits": { 00:05:50.475 "rw_ios_per_sec": 0, 00:05:50.475 "rw_mbytes_per_sec": 0, 00:05:50.475 "r_mbytes_per_sec": 0, 00:05:50.475 "w_mbytes_per_sec": 0 00:05:50.475 }, 00:05:50.475 "claimed": false, 00:05:50.475 "zoned": false, 00:05:50.475 "supported_io_types": { 00:05:50.475 "read": true, 00:05:50.475 "write": true, 00:05:50.475 "unmap": true, 00:05:50.475 "flush": true, 00:05:50.475 "reset": true, 00:05:50.475 "nvme_admin": false, 00:05:50.475 "nvme_io": false, 00:05:50.475 "nvme_io_md": false, 00:05:50.475 "write_zeroes": true, 00:05:50.475 "zcopy": true, 00:05:50.475 "get_zone_info": false, 00:05:50.475 "zone_management": 
false, 00:05:50.475 "zone_append": false, 00:05:50.475 "compare": false, 00:05:50.475 "compare_and_write": false, 00:05:50.475 "abort": true, 00:05:50.475 "seek_hole": false, 00:05:50.475 "seek_data": false, 00:05:50.475 "copy": true, 00:05:50.475 "nvme_iov_md": false 00:05:50.475 }, 00:05:50.475 "memory_domains": [ 00:05:50.475 { 00:05:50.475 "dma_device_id": "system", 00:05:50.475 "dma_device_type": 1 00:05:50.475 }, 00:05:50.475 { 00:05:50.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:50.475 "dma_device_type": 2 00:05:50.475 } 00:05:50.475 ], 00:05:50.475 "driver_specific": {} 00:05:50.475 } 00:05:50.475 ]' 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 [2024-07-15 13:24:29.855274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:50.475 [2024-07-15 13:24:29.855316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:50.475 [2024-07-15 13:24:29.855339] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b07b20 00:05:50.475 [2024-07-15 13:24:29.855352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:50.475 [2024-07-15 13:24:29.856776] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:50.475 [2024-07-15 13:24:29.856806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:50.475 Passthru0 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.475 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:50.475 { 00:05:50.475 "name": "Malloc2", 00:05:50.475 "aliases": [ 00:05:50.475 "6e860bd6-aaa0-486b-ba6d-ec0340331917" 00:05:50.475 ], 00:05:50.475 "product_name": "Malloc disk", 00:05:50.475 "block_size": 512, 00:05:50.475 "num_blocks": 16384, 00:05:50.475 "uuid": "6e860bd6-aaa0-486b-ba6d-ec0340331917", 00:05:50.475 "assigned_rate_limits": { 00:05:50.475 "rw_ios_per_sec": 0, 00:05:50.475 "rw_mbytes_per_sec": 0, 00:05:50.475 "r_mbytes_per_sec": 0, 00:05:50.475 "w_mbytes_per_sec": 0 00:05:50.475 }, 00:05:50.475 "claimed": true, 00:05:50.475 "claim_type": "exclusive_write", 00:05:50.475 "zoned": false, 00:05:50.475 "supported_io_types": { 00:05:50.475 "read": true, 00:05:50.475 "write": true, 00:05:50.475 "unmap": true, 00:05:50.475 "flush": true, 00:05:50.475 "reset": true, 00:05:50.475 "nvme_admin": false, 00:05:50.475 "nvme_io": false, 00:05:50.475 "nvme_io_md": false, 00:05:50.475 "write_zeroes": true, 00:05:50.475 "zcopy": true, 00:05:50.475 "get_zone_info": false, 00:05:50.475 "zone_management": false, 00:05:50.475 "zone_append": false, 00:05:50.475 "compare": false, 00:05:50.475 "compare_and_write": false, 00:05:50.475 "abort": true, 00:05:50.475 "seek_hole": false, 00:05:50.475 "seek_data": false, 00:05:50.475 "copy": true, 00:05:50.475 "nvme_iov_md": false 00:05:50.475 }, 00:05:50.475 "memory_domains": [ 00:05:50.475 { 00:05:50.475 "dma_device_id": "system", 00:05:50.475 "dma_device_type": 1 00:05:50.475 }, 00:05:50.475 { 00:05:50.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:50.475 "dma_device_type": 2 00:05:50.475 } 00:05:50.475 ], 
00:05:50.475 "driver_specific": {} 00:05:50.475 }, 00:05:50.475 { 00:05:50.475 "name": "Passthru0", 00:05:50.475 "aliases": [ 00:05:50.475 "75ad052f-a748-5ee7-86b1-be9017f0f2bc" 00:05:50.475 ], 00:05:50.475 "product_name": "passthru", 00:05:50.475 "block_size": 512, 00:05:50.475 "num_blocks": 16384, 00:05:50.475 "uuid": "75ad052f-a748-5ee7-86b1-be9017f0f2bc", 00:05:50.475 "assigned_rate_limits": { 00:05:50.475 "rw_ios_per_sec": 0, 00:05:50.475 "rw_mbytes_per_sec": 0, 00:05:50.475 "r_mbytes_per_sec": 0, 00:05:50.475 "w_mbytes_per_sec": 0 00:05:50.475 }, 00:05:50.475 "claimed": false, 00:05:50.475 "zoned": false, 00:05:50.475 "supported_io_types": { 00:05:50.475 "read": true, 00:05:50.475 "write": true, 00:05:50.475 "unmap": true, 00:05:50.475 "flush": true, 00:05:50.475 "reset": true, 00:05:50.475 "nvme_admin": false, 00:05:50.475 "nvme_io": false, 00:05:50.475 "nvme_io_md": false, 00:05:50.475 "write_zeroes": true, 00:05:50.475 "zcopy": true, 00:05:50.475 "get_zone_info": false, 00:05:50.475 "zone_management": false, 00:05:50.475 "zone_append": false, 00:05:50.475 "compare": false, 00:05:50.475 "compare_and_write": false, 00:05:50.475 "abort": true, 00:05:50.475 "seek_hole": false, 00:05:50.475 "seek_data": false, 00:05:50.475 "copy": true, 00:05:50.475 "nvme_iov_md": false 00:05:50.475 }, 00:05:50.475 "memory_domains": [ 00:05:50.475 { 00:05:50.475 "dma_device_id": "system", 00:05:50.475 "dma_device_type": 1 00:05:50.475 }, 00:05:50.475 { 00:05:50.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:50.475 "dma_device_type": 2 00:05:50.475 } 00:05:50.475 ], 00:05:50.475 "driver_specific": { 00:05:50.475 "passthru": { 00:05:50.475 "name": "Passthru0", 00:05:50.475 "base_bdev_name": "Malloc2" 00:05:50.475 } 00:05:50.475 } 00:05:50.476 } 00:05:50.476 ]' 00:05:50.476 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:50.735 00:05:50.735 real 0m0.283s 00:05:50.735 user 0m0.181s 00:05:50.735 sys 0m0.049s 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.735 13:24:29 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:50.735 ************************************ 00:05:50.735 END TEST rpc_daemon_integrity 00:05:50.735 ************************************ 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:50.735 13:24:30 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:50.735 13:24:30 rpc -- rpc/rpc.sh@84 -- # 
killprocess 2026677 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@948 -- # '[' -z 2026677 ']' 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@952 -- # kill -0 2026677 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@953 -- # uname 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2026677 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2026677' 00:05:50.735 killing process with pid 2026677 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@967 -- # kill 2026677 00:05:50.735 13:24:30 rpc -- common/autotest_common.sh@972 -- # wait 2026677 00:05:51.052 00:05:51.052 real 0m2.744s 00:05:51.052 user 0m3.430s 00:05:51.052 sys 0m0.911s 00:05:51.052 13:24:30 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:51.052 13:24:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:51.052 ************************************ 00:05:51.052 END TEST rpc 00:05:51.052 ************************************ 00:05:51.311 13:24:30 -- common/autotest_common.sh@1142 -- # return 0 00:05:51.311 13:24:30 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:51.311 13:24:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:51.311 13:24:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.311 13:24:30 -- common/autotest_common.sh@10 -- # set +x 00:05:51.311 ************************************ 00:05:51.311 START TEST skip_rpc 00:05:51.311 ************************************ 00:05:51.311 13:24:30 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:51.311 * 
Looking for test storage... 00:05:51.311 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:51.311 13:24:30 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:51.311 13:24:30 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:51.311 13:24:30 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:51.311 13:24:30 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:51.311 13:24:30 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.311 13:24:30 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:51.311 ************************************ 00:05:51.311 START TEST skip_rpc 00:05:51.311 ************************************ 00:05:51.311 13:24:30 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:51.311 13:24:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2027211 00:05:51.311 13:24:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:51.311 13:24:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:51.311 13:24:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:51.570 [2024-07-15 13:24:30.743635] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:05:51.570 [2024-07-15 13:24:30.743698] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2027211 ] 00:05:51.570 [2024-07-15 13:24:30.874458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.570 [2024-07-15 13:24:30.975189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2027211 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2027211 ']' 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2027211 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2027211 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2027211' 00:05:56.842 killing process with pid 2027211 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2027211 00:05:56.842 13:24:35 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2027211 00:05:56.842 00:05:56.842 real 0m5.461s 00:05:56.842 user 0m5.106s 00:05:56.842 sys 0m0.383s 00:05:56.842 13:24:36 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:56.842 13:24:36 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.842 ************************************ 00:05:56.842 END TEST skip_rpc 00:05:56.842 ************************************ 00:05:56.842 13:24:36 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:56.842 13:24:36 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:56.842 13:24:36 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:56.842 13:24:36 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.842 13:24:36 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:05:56.842 ************************************ 00:05:56.842 START TEST skip_rpc_with_json 00:05:56.842 ************************************ 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2027942 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2027942 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2027942 ']' 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:56.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:56.842 13:24:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:57.100 [2024-07-15 13:24:36.288619] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:05:57.100 [2024-07-15 13:24:36.288697] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2027942 ] 00:05:57.100 [2024-07-15 13:24:36.417726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.100 [2024-07-15 13:24:36.519788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:58.062 [2024-07-15 13:24:37.146882] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:58.062 request: 00:05:58.062 { 00:05:58.062 "trtype": "tcp", 00:05:58.062 "method": "nvmf_get_transports", 00:05:58.062 "req_id": 1 00:05:58.062 } 00:05:58.062 Got JSON-RPC error response 00:05:58.062 response: 00:05:58.062 { 00:05:58.062 "code": -19, 00:05:58.062 "message": "No such device" 00:05:58.062 } 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:58.062 [2024-07-15 13:24:37.159029] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:58.062 13:24:37 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:58.062 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:58.062 { 00:05:58.062 "subsystems": [ 00:05:58.062 { 00:05:58.062 "subsystem": "keyring", 00:05:58.062 "config": [] 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "subsystem": "iobuf", 00:05:58.062 "config": [ 00:05:58.062 { 00:05:58.062 "method": "iobuf_set_options", 00:05:58.062 "params": { 00:05:58.062 "small_pool_count": 8192, 00:05:58.062 "large_pool_count": 1024, 00:05:58.062 "small_bufsize": 8192, 00:05:58.062 "large_bufsize": 135168 00:05:58.062 } 00:05:58.062 } 00:05:58.062 ] 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "subsystem": "sock", 00:05:58.062 "config": [ 00:05:58.062 { 00:05:58.062 "method": "sock_set_default_impl", 00:05:58.062 "params": { 00:05:58.062 "impl_name": "posix" 00:05:58.062 } 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "method": "sock_impl_set_options", 00:05:58.062 "params": { 00:05:58.062 "impl_name": "ssl", 00:05:58.062 "recv_buf_size": 4096, 00:05:58.062 "send_buf_size": 4096, 00:05:58.062 "enable_recv_pipe": true, 00:05:58.062 "enable_quickack": false, 00:05:58.062 "enable_placement_id": 0, 00:05:58.062 "enable_zerocopy_send_server": true, 00:05:58.062 "enable_zerocopy_send_client": false, 00:05:58.062 "zerocopy_threshold": 0, 00:05:58.062 "tls_version": 0, 00:05:58.062 "enable_ktls": false 00:05:58.062 } 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "method": "sock_impl_set_options", 00:05:58.062 "params": { 
00:05:58.062 "impl_name": "posix", 00:05:58.062 "recv_buf_size": 2097152, 00:05:58.062 "send_buf_size": 2097152, 00:05:58.062 "enable_recv_pipe": true, 00:05:58.062 "enable_quickack": false, 00:05:58.062 "enable_placement_id": 0, 00:05:58.062 "enable_zerocopy_send_server": true, 00:05:58.062 "enable_zerocopy_send_client": false, 00:05:58.062 "zerocopy_threshold": 0, 00:05:58.062 "tls_version": 0, 00:05:58.062 "enable_ktls": false 00:05:58.062 } 00:05:58.062 } 00:05:58.062 ] 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "subsystem": "vmd", 00:05:58.062 "config": [] 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "subsystem": "accel", 00:05:58.062 "config": [ 00:05:58.062 { 00:05:58.062 "method": "accel_set_options", 00:05:58.062 "params": { 00:05:58.062 "small_cache_size": 128, 00:05:58.062 "large_cache_size": 16, 00:05:58.062 "task_count": 2048, 00:05:58.062 "sequence_count": 2048, 00:05:58.062 "buf_count": 2048 00:05:58.062 } 00:05:58.062 } 00:05:58.062 ] 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "subsystem": "bdev", 00:05:58.062 "config": [ 00:05:58.062 { 00:05:58.062 "method": "bdev_set_options", 00:05:58.062 "params": { 00:05:58.062 "bdev_io_pool_size": 65535, 00:05:58.062 "bdev_io_cache_size": 256, 00:05:58.062 "bdev_auto_examine": true, 00:05:58.062 "iobuf_small_cache_size": 128, 00:05:58.062 "iobuf_large_cache_size": 16 00:05:58.062 } 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "method": "bdev_raid_set_options", 00:05:58.062 "params": { 00:05:58.062 "process_window_size_kb": 1024 00:05:58.062 } 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "method": "bdev_iscsi_set_options", 00:05:58.062 "params": { 00:05:58.062 "timeout_sec": 30 00:05:58.062 } 00:05:58.062 }, 00:05:58.062 { 00:05:58.062 "method": "bdev_nvme_set_options", 00:05:58.062 "params": { 00:05:58.062 "action_on_timeout": "none", 00:05:58.062 "timeout_us": 0, 00:05:58.062 "timeout_admin_us": 0, 00:05:58.062 "keep_alive_timeout_ms": 10000, 00:05:58.062 "arbitration_burst": 0, 00:05:58.062 
"low_priority_weight": 0, 00:05:58.062 "medium_priority_weight": 0, 00:05:58.062 "high_priority_weight": 0, 00:05:58.062 "nvme_adminq_poll_period_us": 10000, 00:05:58.062 "nvme_ioq_poll_period_us": 0, 00:05:58.062 "io_queue_requests": 0, 00:05:58.062 "delay_cmd_submit": true, 00:05:58.062 "transport_retry_count": 4, 00:05:58.062 "bdev_retry_count": 3, 00:05:58.062 "transport_ack_timeout": 0, 00:05:58.062 "ctrlr_loss_timeout_sec": 0, 00:05:58.062 "reconnect_delay_sec": 0, 00:05:58.062 "fast_io_fail_timeout_sec": 0, 00:05:58.062 "disable_auto_failback": false, 00:05:58.062 "generate_uuids": false, 00:05:58.062 "transport_tos": 0, 00:05:58.062 "nvme_error_stat": false, 00:05:58.062 "rdma_srq_size": 0, 00:05:58.062 "io_path_stat": false, 00:05:58.062 "allow_accel_sequence": false, 00:05:58.062 "rdma_max_cq_size": 0, 00:05:58.062 "rdma_cm_event_timeout_ms": 0, 00:05:58.062 "dhchap_digests": [ 00:05:58.062 "sha256", 00:05:58.062 "sha384", 00:05:58.062 "sha512" 00:05:58.062 ], 00:05:58.062 "dhchap_dhgroups": [ 00:05:58.062 "null", 00:05:58.062 "ffdhe2048", 00:05:58.062 "ffdhe3072", 00:05:58.062 "ffdhe4096", 00:05:58.062 "ffdhe6144", 00:05:58.062 "ffdhe8192" 00:05:58.062 ] 00:05:58.063 } 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "method": "bdev_nvme_set_hotplug", 00:05:58.063 "params": { 00:05:58.063 "period_us": 100000, 00:05:58.063 "enable": false 00:05:58.063 } 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "method": "bdev_wait_for_examine" 00:05:58.063 } 00:05:58.063 ] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "scsi", 00:05:58.063 "config": null 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "scheduler", 00:05:58.063 "config": [ 00:05:58.063 { 00:05:58.063 "method": "framework_set_scheduler", 00:05:58.063 "params": { 00:05:58.063 "name": "static" 00:05:58.063 } 00:05:58.063 } 00:05:58.063 ] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "vhost_scsi", 00:05:58.063 "config": [] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": 
"vhost_blk", 00:05:58.063 "config": [] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "ublk", 00:05:58.063 "config": [] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "nbd", 00:05:58.063 "config": [] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "nvmf", 00:05:58.063 "config": [ 00:05:58.063 { 00:05:58.063 "method": "nvmf_set_config", 00:05:58.063 "params": { 00:05:58.063 "discovery_filter": "match_any", 00:05:58.063 "admin_cmd_passthru": { 00:05:58.063 "identify_ctrlr": false 00:05:58.063 } 00:05:58.063 } 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "method": "nvmf_set_max_subsystems", 00:05:58.063 "params": { 00:05:58.063 "max_subsystems": 1024 00:05:58.063 } 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "method": "nvmf_set_crdt", 00:05:58.063 "params": { 00:05:58.063 "crdt1": 0, 00:05:58.063 "crdt2": 0, 00:05:58.063 "crdt3": 0 00:05:58.063 } 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "method": "nvmf_create_transport", 00:05:58.063 "params": { 00:05:58.063 "trtype": "TCP", 00:05:58.063 "max_queue_depth": 128, 00:05:58.063 "max_io_qpairs_per_ctrlr": 127, 00:05:58.063 "in_capsule_data_size": 4096, 00:05:58.063 "max_io_size": 131072, 00:05:58.063 "io_unit_size": 131072, 00:05:58.063 "max_aq_depth": 128, 00:05:58.063 "num_shared_buffers": 511, 00:05:58.063 "buf_cache_size": 4294967295, 00:05:58.063 "dif_insert_or_strip": false, 00:05:58.063 "zcopy": false, 00:05:58.063 "c2h_success": true, 00:05:58.063 "sock_priority": 0, 00:05:58.063 "abort_timeout_sec": 1, 00:05:58.063 "ack_timeout": 0, 00:05:58.063 "data_wr_pool_size": 0 00:05:58.063 } 00:05:58.063 } 00:05:58.063 ] 00:05:58.063 }, 00:05:58.063 { 00:05:58.063 "subsystem": "iscsi", 00:05:58.063 "config": [ 00:05:58.063 { 00:05:58.063 "method": "iscsi_set_options", 00:05:58.063 "params": { 00:05:58.063 "node_base": "iqn.2016-06.io.spdk", 00:05:58.063 "max_sessions": 128, 00:05:58.063 "max_connections_per_session": 2, 00:05:58.063 "max_queue_depth": 64, 00:05:58.063 "default_time2wait": 2, 
00:05:58.063 "default_time2retain": 20, 00:05:58.063 "first_burst_length": 8192, 00:05:58.063 "immediate_data": true, 00:05:58.063 "allow_duplicated_isid": false, 00:05:58.063 "error_recovery_level": 0, 00:05:58.063 "nop_timeout": 60, 00:05:58.063 "nop_in_interval": 30, 00:05:58.063 "disable_chap": false, 00:05:58.063 "require_chap": false, 00:05:58.063 "mutual_chap": false, 00:05:58.063 "chap_group": 0, 00:05:58.063 "max_large_datain_per_connection": 64, 00:05:58.063 "max_r2t_per_connection": 4, 00:05:58.063 "pdu_pool_size": 36864, 00:05:58.063 "immediate_data_pool_size": 16384, 00:05:58.063 "data_out_pool_size": 2048 00:05:58.063 } 00:05:58.063 } 00:05:58.063 ] 00:05:58.063 } 00:05:58.063 ] 00:05:58.063 } 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2027942 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2027942 ']' 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2027942 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2027942 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2027942' 00:05:58.063 killing process with pid 2027942 00:05:58.063 13:24:37 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2027942 00:05:58.063 13:24:37 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2027942 00:05:58.630 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2028134 00:05:58.630 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:58.630 13:24:37 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2028134 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2028134 ']' 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2028134 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2028134 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2028134' 00:06:03.899 killing process with pid 2028134 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2028134 00:06:03.899 13:24:42 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2028134 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:03.899 00:06:03.899 real 0m7.000s 00:06:03.899 user 0m6.639s 00:06:03.899 sys 0m0.846s 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:03.899 ************************************ 00:06:03.899 END TEST skip_rpc_with_json 00:06:03.899 ************************************ 00:06:03.899 13:24:43 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:03.899 13:24:43 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:03.899 13:24:43 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:03.899 13:24:43 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.899 13:24:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.899 ************************************ 00:06:03.899 START TEST skip_rpc_with_delay 00:06:03.899 ************************************ 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:03.899 13:24:43 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:03.899 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:04.158 [2024-07-15 13:24:43.367859] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:04.158 [2024-07-15 13:24:43.367970] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:04.158 00:06:04.158 real 0m0.096s 00:06:04.158 user 0m0.051s 00:06:04.158 sys 0m0.044s 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:04.158 13:24:43 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:04.158 ************************************ 00:06:04.158 END TEST skip_rpc_with_delay 00:06:04.158 ************************************ 00:06:04.158 13:24:43 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:04.158 13:24:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:04.158 13:24:43 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:04.158 13:24:43 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:04.158 13:24:43 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.158 13:24:43 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.158 13:24:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:04.158 ************************************ 00:06:04.158 START TEST exit_on_failed_rpc_init 00:06:04.158 ************************************ 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:04.158 
13:24:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2028897 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 2028897 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2028897 ']' 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.158 13:24:43 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:04.158 [2024-07-15 13:24:43.530417] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:06:04.158 [2024-07-15 13:24:43.530482] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2028897 ] 00:06:04.417 [2024-07-15 13:24:43.657681] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.417 [2024-07-15 13:24:43.764191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:05.354 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:05.354 [2024-07-15 13:24:44.530766] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:06:05.354 [2024-07-15 13:24:44.530832] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2029069 ] 00:06:05.354 [2024-07-15 13:24:44.648388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.354 [2024-07-15 13:24:44.745989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:05.354 [2024-07-15 13:24:44.746070] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
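The `waitforlisten` helper seen above polls until the target creates its RPC socket before any commands are issued; the second `spdk_tgt` aborts here precisely because `/var/tmp/spdk.sock` already exists. The polling pattern itself can be sketched in plain shell — the socket is simulated with a temp file created by a background job, and `max_retries` mirrors the helper's default of 100; the names here are illustrative stand-ins, not the autotest code:

```shell
# Stand-in for /var/tmp/spdk.sock; a background job "creates the socket" shortly.
sockfile=$(mktemp -u)
( sleep 0.2; touch "$sockfile" ) &

max_retries=100
i=0
# Poll until the path appears, bailing out after max_retries attempts.
while [ ! -e "$sockfile" ]; do
    i=$((i + 1))
    [ "$i" -ge "$max_retries" ] && { echo timeout; exit 1; }
    sleep 0.1
done
echo listening
rm -f "$sockfile"
```

The real helper additionally probes the PID with `kill -0` on each iteration so it fails fast if the target died instead of merely being slow.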
00:06:05.354 [2024-07-15 13:24:44.746088] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:05.354 [2024-07-15 13:24:44.746100] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2028897 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2028897 ']' 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2028897 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2028897 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2028897' 
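The `killprocess` calls traced above guard the kill with a liveness probe (`kill -0`) and, on Linux, check the process name via `ps --no-headers -o comm=` so a stray `sudo` is never signalled. A minimal sketch of the probe-then-kill part of that pattern, using a throwaway `sleep` in place of the spdk_tgt reactor:

```shell
# Throwaway target process standing in for the spdk_tgt reactor (pid 2028897 above).
sleep 30 &
pid=$!

killprocess() {
    # kill -0 only probes for existence; it sends no signal.
    kill -0 "$1" 2>/dev/null || return 1
    kill "$1"
    # Reap the job so the PID cannot be recycled underneath us.
    wait "$1" 2>/dev/null
    return 0
}

killprocess "$pid" && result=killed || result=missing
echo "$result"
```

The `ps -o comm=` name check from the trace is omitted here for brevity; in the autotest helper it is what distinguishes `reactor_0` from `sudo` before the signal is sent.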
00:06:05.614 killing process with pid 2028897 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2028897 00:06:05.614 13:24:44 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2028897 00:06:05.873 00:06:05.873 real 0m1.816s 00:06:05.873 user 0m2.098s 00:06:05.873 sys 0m0.610s 00:06:05.873 13:24:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:05.873 13:24:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:05.873 ************************************ 00:06:05.873 END TEST exit_on_failed_rpc_init 00:06:05.873 ************************************ 00:06:06.132 13:24:45 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:06.132 13:24:45 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:06.132 00:06:06.132 real 0m14.801s 00:06:06.132 user 0m14.031s 00:06:06.132 sys 0m2.204s 00:06:06.132 13:24:45 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:06.132 13:24:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:06.132 ************************************ 00:06:06.132 END TEST skip_rpc 00:06:06.132 ************************************ 00:06:06.132 13:24:45 -- common/autotest_common.sh@1142 -- # return 0 00:06:06.132 13:24:45 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:06.132 13:24:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:06.132 13:24:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:06.132 13:24:45 -- common/autotest_common.sh@10 -- # set +x 00:06:06.132 ************************************ 00:06:06.132 START TEST rpc_client 00:06:06.132 ************************************ 00:06:06.132 13:24:45 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:06.132 * Looking for test storage... 00:06:06.132 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:06.132 13:24:45 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:06.132 OK 00:06:06.132 13:24:45 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:06.132 00:06:06.132 real 0m0.132s 00:06:06.132 user 0m0.055s 00:06:06.132 sys 0m0.087s 00:06:06.132 13:24:45 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:06.132 13:24:45 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:06.132 ************************************ 00:06:06.132 END TEST rpc_client 00:06:06.132 ************************************ 00:06:06.391 13:24:45 -- common/autotest_common.sh@1142 -- # return 0 00:06:06.391 13:24:45 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:06.391 13:24:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:06.391 13:24:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:06.391 13:24:45 -- common/autotest_common.sh@10 -- # set +x 00:06:06.391 ************************************ 00:06:06.391 START TEST json_config 00:06:06.391 ************************************ 00:06:06.391 13:24:45 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:06.391 13:24:45 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:06.391 13:24:45 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:06.391 13:24:45 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:06.391 13:24:45 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:06.391 13:24:45 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:06.391 13:24:45 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:06.392 13:24:45 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:06.392 13:24:45 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:06.392 13:24:45 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:06.392 13:24:45 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.392 13:24:45 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.392 13:24:45 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.392 13:24:45 json_config -- paths/export.sh@5 -- # export PATH 00:06:06.392 13:24:45 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@47 -- # : 0 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:06.392 
13:24:45 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:06.392 13:24:45 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:06.392 INFO: JSON configuration test init 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:06.392 13:24:45 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:06.392 13:24:45 json_config -- json_config/common.sh@9 -- # local app=target 00:06:06.392 13:24:45 json_config -- json_config/common.sh@10 -- # shift 00:06:06.392 13:24:45 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:06.392 13:24:45 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:06.392 13:24:45 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:06.392 13:24:45 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:06.392 13:24:45 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:06:06.392 13:24:45 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2029351 00:06:06.392 13:24:45 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:06.392 Waiting for target to run... 00:06:06.392 13:24:45 json_config -- json_config/common.sh@25 -- # waitforlisten 2029351 /var/tmp/spdk_tgt.sock 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@829 -- # '[' -z 2029351 ']' 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:06.392 13:24:45 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:06.392 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:06.392 13:24:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:06.651 [2024-07-15 13:24:45.819511] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:06:06.651 [2024-07-15 13:24:45.819585] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2029351 ] 00:06:07.218 [2024-07-15 13:24:46.420985] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.218 [2024-07-15 13:24:46.522691] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.477 13:24:46 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:07.477 13:24:46 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:07.477 13:24:46 json_config -- json_config/common.sh@26 -- # echo '' 00:06:07.477 00:06:07.477 13:24:46 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:07.477 13:24:46 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:07.477 13:24:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:07.477 13:24:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:07.477 13:24:46 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:07.477 13:24:46 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:07.477 13:24:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:07.736 13:24:46 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:07.736 13:24:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:07.994 [2024-07-15 13:24:47.216879] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:07.994 13:24:47 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:07.994 13:24:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:08.253 [2024-07-15 13:24:47.465513] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:08.253 13:24:47 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:08.253 13:24:47 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:08.253 13:24:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:08.253 13:24:47 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:08.253 13:24:47 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:08.253 13:24:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:08.511 [2024-07-15 13:24:47.775010] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:11.047 13:24:50 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:11.047 13:24:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:11.047 13:24:50 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:11.047 13:24:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:11.305 13:24:50 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:11.305 13:24:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:11.305 13:24:50 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:11.305 13:24:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:11.305 13:24:50 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:11.305 13:24:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:11.305 13:24:50 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:11.564 13:24:50 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:11.564 13:24:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:11.823 Nvme0n1p0 Nvme0n1p1 00:06:11.823 13:24:51 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:11.823 13:24:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:12.081 [2024-07-15 13:24:51.370816] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:12.081 [2024-07-15 13:24:51.370874] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:06:12.081 00:06:12.081 13:24:51 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:12.081 13:24:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:12.352 Malloc3 00:06:12.352 13:24:51 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:12.352 13:24:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:12.647 [2024-07-15 13:24:51.856205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:12.647 [2024-07-15 13:24:51.856256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:12.647 [2024-07-15 13:24:51.856282] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad6a00 00:06:12.647 [2024-07-15 13:24:51.856295] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:12.647 [2024-07-15 13:24:51.857917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:12.647 [2024-07-15 13:24:51.857955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:12.648 PTBdevFromMalloc3 00:06:12.648 13:24:51 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:12.648 13:24:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:12.906 Null0 00:06:12.906 13:24:52 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:12.906 13:24:52 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:13.165 Malloc0 00:06:13.165 13:24:52 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:13.165 13:24:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:13.424 Malloc1 00:06:13.424 13:24:52 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:13.424 13:24:52 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:13.682 102400+0 records in 00:06:13.682 102400+0 records out 00:06:13.682 104857600 bytes (105 MB, 100 MiB) copied, 0.308163 s, 340 MB/s 00:06:13.682 13:24:52 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:13.682 13:24:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:13.942 aio_disk 00:06:13.942 13:24:53 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:13.942 13:24:53 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:13.942 13:24:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:19.221 80fd4572-079d-4a80-8e35-bd777e591335 
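The `dd if=/dev/zero of=/sample_aio bs=1024 count=102400` step above backs the `aio_disk` bdev with a 100 MiB file of zeros (the `102400+0 records` lines are dd's own summary). The same invocation, scaled down and pointed at a temp file rather than `/sample_aio`, can be checked for the expected byte count:

```shell
# Small-scale version of the sample_aio backing file: 100 blocks of 1 KiB.
aio_file=$(mktemp)
dd if=/dev/zero of="$aio_file" bs=1024 count=100 2>/dev/null

# wc -c reports the size in bytes: 100 * 1024 = 102400.
size=$(wc -c < "$aio_file")
echo "$size"
rm -f "$aio_file"
```

Any regular file of zeros works for `bdev_aio_create`; the test only needs its size to be a multiple of the block size it passes (1024 here).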
00:06:19.221 13:24:57 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:19.221 13:24:57 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:19.221 13:24:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:19.221 13:24:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:19.221 13:24:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:19.221 13:24:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:19.221 13:24:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:19.221 13:24:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:19.221 13:24:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:19.479 13:24:58 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:19.479 13:24:58 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:19.479 13:24:58 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:19.737 MallocForCryptoBdev 00:06:19.737 13:24:59 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:19.737 13:24:59 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:19.737 13:24:59 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:19.737 13:24:59 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:19.737 13:24:59 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:19.737 13:24:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:19.994 [2024-07-15 13:24:59.345207] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:19.994 CryptoMallocBdev 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:19.994 13:24:59 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@71 -- # sort 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@72 -- # sort 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:19.995 13:24:59 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:19.995 13:24:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:19.995 13:24:59 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\a\0\d\4\4\8\9\-\0\5\d\5\-\4\9\1\a\-\a\6\8\f\-\1\8\7\2\0\d\2\f\b\3\9\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\4\3\6\6\9\0\b\-\3\0\e\6\-\4\e\d\c\-\b\b\7\e\-\4\0\3\2\6\1\d\f\a\1\2\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\e\1\d\9\5\b\2\-\4\f\0\5\-\4\0\f\6\-\b\8\1\1\-\9\8\1\7\2\0\6\9\2\6\b\f\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\8\4\2\3\d\7\7\-\8\a\b\8\-\4\f\9\f\-\a\9\1\b\-\3\8\b\6\0\8\6\e\3\9\b\9\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@86 -- # cat 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:20.252 Expected events matched: 00:06:20.252 bdev_register:2a0d4489-05d5-491a-a68f-18720d2fb391 00:06:20.252 bdev_register:4436690b-30e6-4edc-bb7e-403261dfa123 00:06:20.252 
bdev_register:aio_disk 00:06:20.252 bdev_register:CryptoMallocBdev 00:06:20.252 bdev_register:de1d95b2-4f05-40f6-b811-9817206926bf 00:06:20.252 bdev_register:e8423d77-8ab8-4f9f-a91b-38b6086e39b9 00:06:20.252 bdev_register:Malloc0 00:06:20.252 bdev_register:Malloc0p0 00:06:20.252 bdev_register:Malloc0p1 00:06:20.252 bdev_register:Malloc0p2 00:06:20.252 bdev_register:Malloc1 00:06:20.252 bdev_register:Malloc3 00:06:20.252 bdev_register:MallocForCryptoBdev 00:06:20.252 bdev_register:Null0 00:06:20.252 bdev_register:Nvme0n1 00:06:20.252 bdev_register:Nvme0n1p0 00:06:20.252 bdev_register:Nvme0n1p1 00:06:20.252 bdev_register:PTBdevFromMalloc3 00:06:20.252 13:24:59 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:20.252 13:24:59 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:20.252 13:24:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:20.511 13:24:59 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:20.511 13:24:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:20.511 13:24:59 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:20.511 13:24:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:21.077 MallocBdevForConfigChangeCheck 00:06:21.077 13:25:00 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:21.077 13:25:00 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:21.077 13:25:00 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:21.077 13:25:00 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:21.077 13:25:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:21.337 13:25:00 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:21.337 INFO: shutting down applications... 00:06:21.337 13:25:00 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:21.337 13:25:00 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:21.337 13:25:00 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:21.337 13:25:00 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:21.595 [2024-07-15 13:25:00.837873] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:24.880 Calling clear_iscsi_subsystem 00:06:24.880 Calling clear_nvmf_subsystem 00:06:24.880 Calling clear_nbd_subsystem 00:06:24.880 Calling clear_ublk_subsystem 00:06:24.880 Calling clear_vhost_blk_subsystem 00:06:24.880 Calling clear_vhost_scsi_subsystem 00:06:24.880 Calling clear_bdev_subsystem 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:24.880 13:25:03 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:24.880 13:25:04 json_config -- json_config/json_config.sh@345 -- # break 00:06:24.880 13:25:04 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:24.880 13:25:04 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:24.880 13:25:04 json_config -- json_config/common.sh@31 -- # local app=target 00:06:24.880 13:25:04 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:24.880 13:25:04 json_config -- json_config/common.sh@35 -- # [[ -n 2029351 ]] 00:06:24.880 13:25:04 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2029351 00:06:24.880 13:25:04 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:24.880 13:25:04 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:24.880 13:25:04 json_config -- json_config/common.sh@41 -- # kill -0 2029351 00:06:24.880 13:25:04 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:25.449 13:25:04 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:25.449 13:25:04 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:25.449 13:25:04 json_config -- json_config/common.sh@41 -- # kill -0 2029351 00:06:25.449 13:25:04 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:25.449 13:25:04 json_config -- json_config/common.sh@43 -- # break 00:06:25.449 13:25:04 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:25.449 13:25:04 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:25.449 SPDK target 
shutdown done 00:06:25.449 13:25:04 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:25.449 INFO: relaunching applications... 00:06:25.449 13:25:04 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:25.449 13:25:04 json_config -- json_config/common.sh@9 -- # local app=target 00:06:25.449 13:25:04 json_config -- json_config/common.sh@10 -- # shift 00:06:25.449 13:25:04 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:25.449 13:25:04 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:25.449 13:25:04 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:25.449 13:25:04 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:25.449 13:25:04 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:25.449 13:25:04 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2031968 00:06:25.449 13:25:04 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:25.449 13:25:04 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:25.449 Waiting for target to run... 00:06:25.449 13:25:04 json_config -- json_config/common.sh@25 -- # waitforlisten 2031968 /var/tmp/spdk_tgt.sock 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@829 -- # '[' -z 2031968 ']' 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:06:25.449 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.449 13:25:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:25.449 [2024-07-15 13:25:04.747172] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:06:25.449 [2024-07-15 13:25:04.747255] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2031968 ] 00:06:26.018 [2024-07-15 13:25:05.326437] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.018 [2024-07-15 13:25:05.433481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.277 [2024-07-15 13:25:05.487616] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:26.277 [2024-07-15 13:25:05.495653] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:26.277 [2024-07-15 13:25:05.503670] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:26.277 [2024-07-15 13:25:05.584897] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:28.813 [2024-07-15 13:25:07.793366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:28.813 [2024-07-15 13:25:07.793433] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:28.813 [2024-07-15 13:25:07.793448] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:28.813 [2024-07-15 13:25:07.801387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:06:28.813 [2024-07-15 13:25:07.801413] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:28.813 [2024-07-15 13:25:07.809400] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:28.813 [2024-07-15 13:25:07.809424] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:28.813 [2024-07-15 13:25:07.817434] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:28.813 [2024-07-15 13:25:07.817461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:28.813 [2024-07-15 13:25:07.817474] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:28.813 [2024-07-15 13:25:08.190172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:28.813 [2024-07-15 13:25:08.190219] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:28.813 [2024-07-15 13:25:08.190237] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1464b90 00:06:28.813 [2024-07-15 13:25:08.190249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:28.813 [2024-07-15 13:25:08.190546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:28.813 [2024-07-15 13:25:08.190565] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:29.072 13:25:08 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.072 13:25:08 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:29.072 13:25:08 json_config -- json_config/common.sh@26 -- # echo '' 00:06:29.072 00:06:29.072 13:25:08 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:29.072 13:25:08 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:06:29.072 INFO: Checking if target configuration is the same... 00:06:29.072 13:25:08 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:29.072 13:25:08 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:29.072 13:25:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:29.072 + '[' 2 -ne 2 ']' 00:06:29.072 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:29.072 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:29.072 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:29.072 +++ basename /dev/fd/62 00:06:29.072 ++ mktemp /tmp/62.XXX 00:06:29.072 + tmp_file_1=/tmp/62.U7g 00:06:29.072 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:29.072 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:29.072 + tmp_file_2=/tmp/spdk_tgt_config.json.xVP 00:06:29.072 + ret=0 00:06:29.072 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:29.331 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:29.331 + diff -u /tmp/62.U7g /tmp/spdk_tgt_config.json.xVP 00:06:29.331 + echo 'INFO: JSON config files are the same' 00:06:29.331 INFO: JSON config files are the same 00:06:29.331 + rm /tmp/62.U7g /tmp/spdk_tgt_config.json.xVP 00:06:29.331 + exit 0 00:06:29.331 13:25:08 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:29.331 13:25:08 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:06:29.332 INFO: changing configuration and checking if this can be detected... 00:06:29.332 13:25:08 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:29.332 13:25:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:29.590 13:25:08 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:29.590 13:25:08 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:29.590 13:25:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:29.590 + '[' 2 -ne 2 ']' 00:06:29.590 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:29.590 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:29.590 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:29.590 +++ basename /dev/fd/62 00:06:29.590 ++ mktemp /tmp/62.XXX 00:06:29.590 + tmp_file_1=/tmp/62.vub 00:06:29.590 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:29.590 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:29.590 + tmp_file_2=/tmp/spdk_tgt_config.json.Mpy 00:06:29.590 + ret=0 00:06:29.590 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:30.158 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:30.158 + diff -u /tmp/62.vub /tmp/spdk_tgt_config.json.Mpy 00:06:30.158 + ret=1 00:06:30.158 + echo '=== Start of file: /tmp/62.vub ===' 00:06:30.158 + cat /tmp/62.vub 00:06:30.158 + echo '=== End of file: /tmp/62.vub ===' 00:06:30.158 + echo '' 00:06:30.158 + echo '=== Start of file: /tmp/spdk_tgt_config.json.Mpy ===' 00:06:30.158 + cat /tmp/spdk_tgt_config.json.Mpy 00:06:30.158 + echo '=== End of file: /tmp/spdk_tgt_config.json.Mpy ===' 00:06:30.158 + echo '' 00:06:30.158 + rm /tmp/62.vub /tmp/spdk_tgt_config.json.Mpy 00:06:30.158 + exit 1 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:30.158 INFO: configuration change detected. 
00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:30.158 13:25:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:30.158 13:25:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@317 -- # [[ -n 2031968 ]] 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:30.158 13:25:09 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:30.158 13:25:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:30.158 13:25:09 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:30.158 13:25:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:30.418 13:25:09 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:30.418 13:25:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:30.418 13:25:09 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:30.418 13:25:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0
00:06:30.678 13:25:09 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test
00:06:30.678 13:25:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@193 -- # uname -s
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]]
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]]
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@728 -- # xtrace_disable
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@10 -- # set +x
00:06:30.938 13:25:10 json_config -- json_config/json_config.sh@323 -- # killprocess 2031968
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@948 -- # '[' -z 2031968 ']'
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@952 -- # kill -0 2031968
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@953 -- # uname
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2031968
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2031968'
00:06:30.938 killing process with pid 2031968
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@967 -- # kill 2031968
00:06:30.938 13:25:10 json_config -- common/autotest_common.sh@972 -- # wait 2031968
00:06:34.252 13:25:13 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json
00:06:34.252 13:25:13 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini
00:06:34.252 13:25:13 json_config -- common/autotest_common.sh@728 -- # xtrace_disable
00:06:34.252 13:25:13 json_config -- common/autotest_common.sh@10 -- # set +x
00:06:34.252 13:25:13 json_config -- json_config/json_config.sh@328 -- # return 0
00:06:34.252 13:25:13 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success'
00:06:34.252 INFO: Success
00:06:34.252
00:06:34.252 real 0m27.902s
00:06:34.252 user 0m33.634s
00:06:34.252 sys 0m4.066s
00:06:34.252 13:25:13 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:34.252 13:25:13 json_config -- common/autotest_common.sh@10 -- # set +x
00:06:34.252 ************************************
00:06:34.252 END TEST json_config
00:06:34.252 ************************************
00:06:34.252 13:25:13 -- common/autotest_common.sh@1142 -- # return 0
00:06:34.252 13:25:13 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:06:34.252 13:25:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:34.252 13:25:13 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:34.252 13:25:13 -- common/autotest_common.sh@10 -- # set +x
00:06:34.252 ************************************
00:06:34.252 START TEST json_config_extra_key
00:06:34.252 ************************************
00:06:34.252 13:25:13 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:06:34.513 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:06:34.513 13:25:13 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:06:34.513 13:25:13 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:06:34.513 13:25:13 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:06:34.513 13:25:13 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:06:34.513 13:25:13 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:06:34.513 13:25:13 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:06:34.514 13:25:13 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:06:34.514 13:25:13 json_config_extra_key -- paths/export.sh@5 -- # export PATH
00:06:34.514 13:25:13 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@47 -- # : 0
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:06:34.514 13:25:13 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='')
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock')
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024')
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json')
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...'
00:06:34.514 INFO: launching applications...
00:06:34.514 13:25:13 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@9 -- # local app=target
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@10 -- # shift
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]]
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]]
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params=
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2033317
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...'
00:06:34.514 Waiting for target to run...
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2033317 /var/tmp/spdk_tgt.sock
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2033317 ']'
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock
00:06:34.514 13:25:13 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...'
00:06:34.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:34.514 13:25:13 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x
00:06:34.514 [2024-07-15 13:25:13.795314] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:06:34.514 [2024-07-15 13:25:13.795392] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2033317 ]
00:06:35.083 [2024-07-15 13:25:14.428143] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:35.341 [2024-07-15 13:25:14.529163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:35.341 13:25:14 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:35.341 13:25:14 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@26 -- # echo ''
00:06:35.341
00:06:35.341 13:25:14 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...'
00:06:35.341 INFO: shutting down applications...
00:06:35.341 13:25:14 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@31 -- # local app=target
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]]
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2033317 ]]
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2033317
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 ))
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 ))
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2033317
00:06:35.341 13:25:14 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ ))
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 ))
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2033317
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]=
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@43 -- # break
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]]
00:06:35.908 13:25:15 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done'
00:06:35.908 SPDK target shutdown done
00:06:35.908 13:25:15 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success
00:06:35.908 Success
00:06:35.908
00:06:35.908 real 0m1.618s
00:06:35.908 user 0m1.042s
00:06:35.908 sys 0m0.768s
00:06:35.908 13:25:15 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:35.908 13:25:15 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x
00:06:35.908 ************************************
00:06:35.908 END TEST json_config_extra_key
00:06:35.908 ************************************
00:06:35.908 13:25:15 -- common/autotest_common.sh@1142 -- # return 0
00:06:35.908 13:25:15 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:06:35.908 13:25:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:35.908 13:25:15 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:35.908 13:25:15 -- common/autotest_common.sh@10 -- # set +x
00:06:35.908 ************************************
00:06:35.908 START TEST alias_rpc
00:06:35.908 ************************************
00:06:35.908 13:25:15 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:06:36.168 * Looking for test storage...
00:06:36.168 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc
00:06:36.168 13:25:15 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:06:36.168 13:25:15 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2033548
00:06:36.168 13:25:15 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2033548
00:06:36.168 13:25:15 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2033548 ']'
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:36.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:36.168 13:25:15 alias_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:36.168 [2024-07-15 13:25:15.476944] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:06:36.168 [2024-07-15 13:25:15.477020] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2033548 ]
00:06:36.427 [2024-07-15 13:25:15.607320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:36.427 [2024-07-15 13:25:15.707262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:36.996 13:25:16 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:36.996 13:25:16 alias_rpc -- common/autotest_common.sh@862 -- # return 0
00:06:36.996 13:25:16 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i
00:06:37.255 13:25:16 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2033548
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2033548 ']'
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2033548
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@953 -- # uname
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2033548
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2033548'
00:06:37.255 killing process with pid 2033548
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@967 -- # kill 2033548
00:06:37.255 13:25:16 alias_rpc -- common/autotest_common.sh@972 -- # wait 2033548
00:06:37.514
00:06:37.514 real 0m1.592s
00:06:37.514 user 0m1.567s
00:06:37.514 sys 0m0.566s
00:06:37.514 13:25:16 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:37.514 13:25:16 alias_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:37.514 ************************************
00:06:37.514 END TEST alias_rpc
00:06:37.514 ************************************
00:06:37.773 13:25:16 -- common/autotest_common.sh@1142 -- # return 0
00:06:37.774 13:25:16 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]]
00:06:37.774 13:25:16 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh
00:06:37.774 13:25:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:37.774 13:25:16 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:37.774 13:25:16 -- common/autotest_common.sh@10 -- # set +x
00:06:37.774 ************************************
00:06:37.774 START TEST spdkcli_tcp
00:06:37.774 ************************************
00:06:37.774 13:25:16 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh
00:06:37.774 * Looking for test storage...
00:06:37.774 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2033932
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2033932
00:06:37.774 13:25:17 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2033932 ']'
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:37.774 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:37.774 13:25:17 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:37.774 [2024-07-15 13:25:17.168856] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:06:37.774 [2024-07-15 13:25:17.168924] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2033932 ]
00:06:38.033 [2024-07-15 13:25:17.283275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:06:38.033 [2024-07-15 13:25:17.390645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:06:38.033 [2024-07-15 13:25:17.390651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:38.972 13:25:18 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:38.972 13:25:18 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0
00:06:38.972 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2033960
00:06:38.972 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock
00:06:38.972 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
00:06:38.972 [
00:06:38.972 "bdev_malloc_delete",
00:06:38.972 "bdev_malloc_create",
00:06:38.972 "bdev_null_resize",
00:06:38.972 "bdev_null_delete",
00:06:38.972 "bdev_null_create",
00:06:38.972 "bdev_nvme_cuse_unregister",
00:06:38.972 "bdev_nvme_cuse_register",
00:06:38.972 "bdev_opal_new_user",
00:06:38.972 "bdev_opal_set_lock_state",
00:06:38.972 "bdev_opal_delete",
00:06:38.972 "bdev_opal_get_info",
00:06:38.972 "bdev_opal_create",
00:06:38.972 "bdev_nvme_opal_revert",
00:06:38.972 "bdev_nvme_opal_init",
00:06:38.972 "bdev_nvme_send_cmd",
00:06:38.972 "bdev_nvme_get_path_iostat",
00:06:38.972 "bdev_nvme_get_mdns_discovery_info",
00:06:38.972 "bdev_nvme_stop_mdns_discovery",
00:06:38.972 "bdev_nvme_start_mdns_discovery",
00:06:38.972 "bdev_nvme_set_multipath_policy",
00:06:38.972 "bdev_nvme_set_preferred_path",
00:06:38.972 "bdev_nvme_get_io_paths",
00:06:38.972 "bdev_nvme_remove_error_injection",
00:06:38.972 "bdev_nvme_add_error_injection",
00:06:38.972 "bdev_nvme_get_discovery_info",
00:06:38.972 "bdev_nvme_stop_discovery",
00:06:38.972 "bdev_nvme_start_discovery",
00:06:38.972 "bdev_nvme_get_controller_health_info",
00:06:38.972 "bdev_nvme_disable_controller",
00:06:38.972 "bdev_nvme_enable_controller",
00:06:38.972 "bdev_nvme_reset_controller",
00:06:38.972 "bdev_nvme_get_transport_statistics",
00:06:38.972 "bdev_nvme_apply_firmware",
00:06:38.972 "bdev_nvme_detach_controller",
00:06:38.972 "bdev_nvme_get_controllers",
00:06:38.972 "bdev_nvme_attach_controller",
00:06:38.972 "bdev_nvme_set_hotplug",
00:06:38.972 "bdev_nvme_set_options",
00:06:38.972 "bdev_passthru_delete",
00:06:38.972 "bdev_passthru_create",
00:06:38.972 "bdev_lvol_set_parent_bdev",
00:06:38.972 "bdev_lvol_set_parent",
00:06:38.972 "bdev_lvol_check_shallow_copy",
00:06:38.972 "bdev_lvol_start_shallow_copy",
00:06:38.972 "bdev_lvol_grow_lvstore",
00:06:38.972 "bdev_lvol_get_lvols",
00:06:38.972 "bdev_lvol_get_lvstores",
00:06:38.972 "bdev_lvol_delete",
00:06:38.972 "bdev_lvol_set_read_only",
00:06:38.972 "bdev_lvol_resize",
00:06:38.972 "bdev_lvol_decouple_parent",
00:06:38.972 "bdev_lvol_inflate",
00:06:38.972 "bdev_lvol_rename",
00:06:38.972 "bdev_lvol_clone_bdev",
00:06:38.972 "bdev_lvol_clone",
00:06:38.972 "bdev_lvol_snapshot",
00:06:38.972 "bdev_lvol_create",
00:06:38.972 "bdev_lvol_delete_lvstore",
00:06:38.972 "bdev_lvol_rename_lvstore",
00:06:38.972 "bdev_lvol_create_lvstore",
00:06:38.972 "bdev_raid_set_options",
00:06:38.972 "bdev_raid_remove_base_bdev",
00:06:38.972 "bdev_raid_add_base_bdev",
00:06:38.972 "bdev_raid_delete",
00:06:38.972 "bdev_raid_create",
00:06:38.972 "bdev_raid_get_bdevs",
00:06:38.972 "bdev_error_inject_error",
00:06:38.972 "bdev_error_delete",
00:06:38.972 "bdev_error_create",
00:06:38.972 "bdev_split_delete",
00:06:38.972 "bdev_split_create",
00:06:38.972 "bdev_delay_delete",
00:06:38.972 "bdev_delay_create",
00:06:38.972 "bdev_delay_update_latency",
00:06:38.972 "bdev_zone_block_delete",
00:06:38.972 "bdev_zone_block_create",
00:06:38.972 "blobfs_create",
00:06:38.972 "blobfs_detect",
00:06:38.972 "blobfs_set_cache_size",
00:06:38.972 "bdev_crypto_delete",
00:06:38.972 "bdev_crypto_create",
00:06:38.972 "bdev_compress_delete",
00:06:38.972 "bdev_compress_create",
00:06:38.972 "bdev_compress_get_orphans",
00:06:38.972 "bdev_aio_delete",
00:06:38.972 "bdev_aio_rescan",
00:06:38.972 "bdev_aio_create",
00:06:38.972 "bdev_ftl_set_property",
00:06:38.972 "bdev_ftl_get_properties",
00:06:38.972 "bdev_ftl_get_stats",
00:06:38.972 "bdev_ftl_unmap",
00:06:38.972 "bdev_ftl_unload",
00:06:38.972 "bdev_ftl_delete",
00:06:38.972 "bdev_ftl_load",
00:06:38.972 "bdev_ftl_create",
00:06:38.972 "bdev_virtio_attach_controller",
00:06:38.972 "bdev_virtio_scsi_get_devices",
00:06:38.972 "bdev_virtio_detach_controller",
00:06:38.972 "bdev_virtio_blk_set_hotplug",
00:06:38.972 "bdev_iscsi_delete",
00:06:38.972 "bdev_iscsi_create",
00:06:38.972 "bdev_iscsi_set_options",
00:06:38.972 "accel_error_inject_error",
00:06:38.972 "ioat_scan_accel_module",
00:06:38.972 "dsa_scan_accel_module",
00:06:38.972 "iaa_scan_accel_module",
00:06:38.972 "dpdk_cryptodev_get_driver",
00:06:38.972 "dpdk_cryptodev_set_driver",
00:06:38.972 "dpdk_cryptodev_scan_accel_module",
00:06:38.972 "compressdev_scan_accel_module",
00:06:38.972 "keyring_file_remove_key",
00:06:38.972 "keyring_file_add_key",
00:06:38.972 "keyring_linux_set_options",
00:06:38.972 "iscsi_get_histogram",
00:06:38.972 "iscsi_enable_histogram",
00:06:38.972 "iscsi_set_options",
00:06:38.972 "iscsi_get_auth_groups",
00:06:38.972 "iscsi_auth_group_remove_secret",
00:06:38.972 "iscsi_auth_group_add_secret",
00:06:38.972 "iscsi_delete_auth_group",
00:06:38.972 "iscsi_create_auth_group",
00:06:38.972 "iscsi_set_discovery_auth",
00:06:38.972 "iscsi_get_options",
00:06:38.972 "iscsi_target_node_request_logout",
00:06:38.972 "iscsi_target_node_set_redirect",
00:06:38.972 "iscsi_target_node_set_auth",
00:06:38.972 "iscsi_target_node_add_lun",
00:06:38.972 "iscsi_get_stats",
00:06:38.972 "iscsi_get_connections",
00:06:38.972 "iscsi_portal_group_set_auth",
00:06:38.972 "iscsi_start_portal_group",
00:06:38.972 "iscsi_delete_portal_group",
00:06:38.972 "iscsi_create_portal_group",
00:06:38.972 "iscsi_get_portal_groups",
00:06:38.972 "iscsi_delete_target_node",
00:06:38.972 "iscsi_target_node_remove_pg_ig_maps",
00:06:38.972 "iscsi_target_node_add_pg_ig_maps",
00:06:38.972 "iscsi_create_target_node",
00:06:38.972 "iscsi_get_target_nodes",
00:06:38.972 "iscsi_delete_initiator_group",
00:06:38.972 "iscsi_initiator_group_remove_initiators",
00:06:38.972 "iscsi_initiator_group_add_initiators",
00:06:38.972 "iscsi_create_initiator_group",
00:06:38.972 "iscsi_get_initiator_groups",
00:06:38.972 "nvmf_set_crdt",
00:06:38.972 "nvmf_set_config",
00:06:38.972 "nvmf_set_max_subsystems",
00:06:38.972 "nvmf_stop_mdns_prr",
00:06:38.972 "nvmf_publish_mdns_prr",
00:06:38.972 "nvmf_subsystem_get_listeners",
00:06:38.972 "nvmf_subsystem_get_qpairs",
00:06:38.972 "nvmf_subsystem_get_controllers",
00:06:38.972 "nvmf_get_stats",
00:06:38.972 "nvmf_get_transports",
00:06:38.972 "nvmf_create_transport",
00:06:38.972 "nvmf_get_targets",
00:06:38.972 "nvmf_delete_target",
00:06:38.972 "nvmf_create_target",
00:06:38.972 "nvmf_subsystem_allow_any_host",
00:06:38.972 "nvmf_subsystem_remove_host",
00:06:38.972 "nvmf_subsystem_add_host",
00:06:38.972 "nvmf_ns_remove_host",
00:06:38.972 "nvmf_ns_add_host",
00:06:38.972 "nvmf_subsystem_remove_ns",
00:06:38.972 "nvmf_subsystem_add_ns",
00:06:38.972 "nvmf_subsystem_listener_set_ana_state",
00:06:38.972 "nvmf_discovery_get_referrals",
00:06:38.972 "nvmf_discovery_remove_referral",
00:06:38.972 "nvmf_discovery_add_referral",
00:06:38.972 "nvmf_subsystem_remove_listener",
00:06:38.972 "nvmf_subsystem_add_listener",
00:06:38.972 "nvmf_delete_subsystem",
00:06:38.972 "nvmf_create_subsystem",
00:06:38.972 "nvmf_get_subsystems",
00:06:38.972 "env_dpdk_get_mem_stats",
00:06:38.972 "nbd_get_disks",
00:06:38.972 "nbd_stop_disk",
00:06:38.972 "nbd_start_disk",
00:06:38.972 "ublk_recover_disk",
00:06:38.972 "ublk_get_disks",
00:06:38.972 "ublk_stop_disk",
00:06:38.972 "ublk_start_disk",
00:06:38.972 "ublk_destroy_target",
00:06:38.972 "ublk_create_target",
00:06:38.972 "virtio_blk_create_transport",
00:06:38.972 "virtio_blk_get_transports",
00:06:38.972 "vhost_controller_set_coalescing",
00:06:38.972 "vhost_get_controllers",
00:06:38.972 "vhost_delete_controller",
00:06:38.972 "vhost_create_blk_controller",
00:06:38.972 "vhost_scsi_controller_remove_target",
00:06:38.972 "vhost_scsi_controller_add_target",
00:06:38.972 "vhost_start_scsi_controller",
00:06:38.972 "vhost_create_scsi_controller",
00:06:38.972 "thread_set_cpumask",
00:06:38.972 "framework_get_governor",
00:06:38.972 "framework_get_scheduler",
00:06:38.972 "framework_set_scheduler",
00:06:38.972 "framework_get_reactors",
00:06:38.972 "thread_get_io_channels",
00:06:38.972 "thread_get_pollers",
00:06:38.972 "thread_get_stats",
00:06:38.972 "framework_monitor_context_switch",
00:06:38.972 "spdk_kill_instance",
00:06:38.972 "log_enable_timestamps",
00:06:38.972 "log_get_flags",
00:06:38.972 "log_clear_flag",
00:06:38.972 "log_set_flag",
00:06:38.972 "log_get_level",
00:06:38.972 "log_set_level",
00:06:38.972 "log_get_print_level",
00:06:38.972 "log_set_print_level",
00:06:38.972 "framework_enable_cpumask_locks",
00:06:38.972 "framework_disable_cpumask_locks",
00:06:38.972 "framework_wait_init",
00:06:38.972 "framework_start_init",
00:06:38.972 "scsi_get_devices",
00:06:38.972 "bdev_get_histogram",
00:06:38.972 "bdev_enable_histogram",
00:06:38.972 "bdev_set_qos_limit",
00:06:38.972 "bdev_set_qd_sampling_period",
00:06:38.972 "bdev_get_bdevs",
00:06:38.972 "bdev_reset_iostat",
00:06:38.972 "bdev_get_iostat",
00:06:38.972 "bdev_examine",
00:06:38.973 "bdev_wait_for_examine",
00:06:38.973 "bdev_set_options",
00:06:38.973 "notify_get_notifications",
00:06:38.973 "notify_get_types",
00:06:38.973 "accel_get_stats",
00:06:38.973 "accel_set_options",
00:06:38.973 "accel_set_driver",
00:06:38.973 "accel_crypto_key_destroy",
00:06:38.973 "accel_crypto_keys_get",
00:06:38.973 "accel_crypto_key_create",
00:06:38.973 "accel_assign_opc",
00:06:38.973 "accel_get_module_info",
00:06:38.973 "accel_get_opc_assignments",
00:06:38.973 "vmd_rescan",
00:06:38.973 "vmd_remove_device",
00:06:38.973 "vmd_enable",
00:06:38.973 "sock_get_default_impl",
00:06:38.973 "sock_set_default_impl",
00:06:38.973 "sock_impl_set_options",
00:06:38.973 "sock_impl_get_options",
00:06:38.973 "iobuf_get_stats",
00:06:38.973 "iobuf_set_options",
00:06:38.973 "framework_get_pci_devices",
00:06:38.973 "framework_get_config",
00:06:38.973 "framework_get_subsystems",
00:06:38.973 "trace_get_info",
00:06:38.973 "trace_get_tpoint_group_mask",
00:06:38.973 "trace_disable_tpoint_group",
00:06:38.973 "trace_enable_tpoint_group",
00:06:38.973 "trace_clear_tpoint_mask",
00:06:38.973 "trace_set_tpoint_mask",
00:06:38.973 "keyring_get_keys",
00:06:38.973 "spdk_get_version",
00:06:38.973 "rpc_get_methods"
00:06:38.973 ]
00:06:38.973 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:38.973 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT
00:06:38.973 13:25:18 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2033932
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2033932 ']'
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2033932
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2033932
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2033932'
00:06:38.973 killing process with pid 2033932
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2033932
00:06:38.973 13:25:18 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2033932
00:06:39.539
00:06:39.539 real 0m1.768s
00:06:39.539 user 0m3.162s
00:06:39.539 sys 0m0.591s
00:06:39.539 13:25:18 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:39.539 13:25:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:06:39.539 ************************************
00:06:39.539 END TEST spdkcli_tcp
00:06:39.539 ************************************
00:06:39.539 13:25:18 -- common/autotest_common.sh@1142 -- # return 0
00:06:39.539 13:25:18 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:06:39.539 13:25:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:39.539 13:25:18 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:39.539 13:25:18 -- common/autotest_common.sh@10 -- # set +x
00:06:39.539 ************************************
00:06:39.539 START TEST dpdk_mem_utility
00:06:39.539 ************************************
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:06:39.539 * Looking for test storage...
00:06:39.539 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility
00:06:39.539 13:25:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py
00:06:39.539 13:25:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2034191
00:06:39.539 13:25:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2034191
00:06:39.539 13:25:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2034191 ']'
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:39.539 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:39.539 13:25:18 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:39.797 [2024-07-15 13:25:18.975900] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:06:39.797 [2024-07-15 13:25:18.975984] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034191 ]
00:06:39.797 [2024-07-15 13:25:19.087273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:39.797 [2024-07-15 13:25:19.186721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:40.362 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:40.362 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0
00:06:40.362 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT
00:06:40.362 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats
00:06:40.362 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable
00:06:40.362 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:40.362 {
00:06:40.362 "filename": "/tmp/spdk_mem_dump.txt"
00:06:40.362 }
00:06:40.624 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:06:40.624 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py
00:06:40.624 DPDK memory size 816.000000 MiB in 2 heap(s)
00:06:40.624 2 heaps totaling size 816.000000 MiB
00:06:40.624 size: 814.000000 MiB heap id: 0
00:06:40.624 size: 2.000000 MiB heap id: 1
00:06:40.624 end heaps----------
00:06:40.624 8 mempools totaling size 598.116089 MiB
00:06:40.624 size: 212.674988 MiB name: PDU_immediate_data_Pool
00:06:40.624 size: 158.602051 MiB name: PDU_data_out_Pool
00:06:40.624 size: 84.521057 MiB name: bdev_io_2034191
00:06:40.624 size: 51.011292 MiB name: evtpool_2034191
00:06:40.624 size: 50.003479 MiB name: msgpool_2034191
00:06:40.624 size: 21.763794 MiB name: PDU_Pool
00:06:40.624 size: 19.513306 MiB name: SCSI_TASK_Pool
00:06:40.624 size: 0.026123 MiB name: Session_Pool
00:06:40.624 end mempools-------
00:06:40.624 201 memzones totaling size 4.176453 MiB
00:06:40.624 size: 1.000366 MiB name: RG_ring_0_2034191
00:06:40.624 size: 1.000366 MiB name: RG_ring_1_2034191
00:06:40.624 size: 1.000366 MiB name: RG_ring_4_2034191
00:06:40.624 size: 1.000366 MiB name: RG_ring_5_2034191
00:06:40.624 size: 0.125366 MiB name: RG_ring_2_2034191
00:06:40.624 size: 0.015991 MiB name: RG_ring_3_2034191
00:06:40.624 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.0_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.1_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.2_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.3_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.4_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.5_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.6_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:01.7_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.0_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.1_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.2_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.3_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.4_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.5_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.6_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3d:02.7_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.0_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.1_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.2_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.3_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.4_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.5_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.6_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:01.7_qat
00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:40.624 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:40.624 size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:40.624 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:40.624 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:40.624 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:40.624 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:40.624 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:40.625 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:40.625 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:06:40.625 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:40.625 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:40.625 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:40.625 end memzones------- 00:06:40.625 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:40.625 heap id: 0 total size: 814.000000 MiB number of busy elements: 518 number of free elements: 14 00:06:40.625 list of free elements. size: 11.815002 MiB 00:06:40.625 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:40.625 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:40.625 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:40.625 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:40.625 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:40.625 element at address: 0x200013800000 with size: 0.978882 MiB 00:06:40.625 element at address: 0x200007000000 with size: 0.960022 MiB 00:06:40.625 element at address: 0x200019200000 with size: 0.937256 MiB 00:06:40.625 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:06:40.625 element at address: 0x200003a00000 with size: 0.498535 MiB 00:06:40.625 element at address: 0x20000b200000 with size: 0.491272 MiB 00:06:40.625 element at address: 0x200000800000 with size: 0.486694 MiB 00:06:40.625 element at address: 0x200019400000 with size: 0.485840 MiB 00:06:40.625 element at address: 0x200027e00000 with size: 0.403076 MiB 00:06:40.625 list of standard malloc elements. 
size: 199.876709 MiB 00:06:40.625 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:40.625 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:40.625 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:40.625 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:40.625 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:40.625 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:40.625 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:40.625 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:40.625 element at address: 0x200000330b40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000337640 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000033e140 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000344c40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000034b740 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000352240 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000358d40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000035f840 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:40.625 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:40.625 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:40.625 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:06:40.625 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:40.625 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:06:40.625 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000333040 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000335540 with size: 0.004028 MiB 00:06:40.625 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000339b40 with size: 0.004028 MiB 00:06:40.625 element at address: 0x20000033c040 with size: 0.004028 MiB 00:06:40.625 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000340640 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000342b40 with size: 0.004028 MiB 00:06:40.625 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000347140 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000349640 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000350140 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000354740 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000356c40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:06:40.626 element at address: 0x20000035b240 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000035d740 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000376d40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:40.626 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:40.626 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:40.626 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:40.626 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:40.626 element at address: 0x200000204d40 with size: 0.000305 MiB 00:06:40.626 element at address: 0x200000200000 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200180 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200240 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200300 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200480 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200540 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200600 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200780 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200840 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200900 with size: 0.000183 
MiB 00:06:40.626 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200a80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200b40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200c00 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200d80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200e40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200f00 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201080 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201140 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201200 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201380 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201440 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201500 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201680 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201740 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201800 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201980 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201a40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201b00 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201c80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201d40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000201f80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202040 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202100 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202280 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202340 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202400 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202580 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202640 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202700 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202880 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202940 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202a00 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202b80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202c40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202d00 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202e80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000202f40 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203000 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203180 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203240 with size: 0.000183 MiB 00:06:40.626 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203480 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203540 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203600 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203780 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203840 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203900 with size: 0.000183 MiB 00:06:40.626 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203a80 with size: 0.000183 MiB 00:06:40.626 element at address: 0x200000203b40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203c00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203d80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203e40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203f00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204080 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204140 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204200 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204380 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204440 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204500 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204680 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204740 with size: 0.000183 MiB 
00:06:40.627 element at address: 0x200000204800 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204980 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204a40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204b00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204c80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204e80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000204f40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205000 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205180 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205240 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205300 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205480 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205540 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205600 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205780 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205840 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205900 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205a80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205b40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205c00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205e40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205f00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000206080 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000206140 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000206200 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000020a780 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022af80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b040 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b100 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b280 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b340 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b400 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b580 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b640 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b700 with size: 0.000183 MiB 00:06:40.627 element at address: 
0x20000022b900 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022be40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c080 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c140 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c200 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c380 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c440 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000022c500 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000032e700 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000331d40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000338840 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000033f340 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000345e40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000034c940 with size: 0.000183 MiB 00:06:40.627 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000353440 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000359f40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000360a40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000367d00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000376740 with size: 0.000183 
MiB 00:06:40.627 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:40.627 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000390280 
with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000397640 with size: 0.000183 MiB 00:06:40.627 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:40.628 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:06:40.628 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087cc80 with 
size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e67300 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e673c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6dfc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:40.628 element at address: 
0x200027e6f0c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:40.628 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:40.628 list of memzone associated elements. 
size: 602.308289 MiB 00:06:40.628 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:40.628 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:40.628 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:40.628 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:40.628 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:40.628 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2034191_0 00:06:40.628 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:40.628 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2034191_0 00:06:40.628 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:40.629 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2034191_0 00:06:40.629 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:40.629 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:40.629 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:40.629 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:40.629 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:40.629 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2034191 00:06:40.629 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:40.629 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2034191 00:06:40.629 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:06:40.629 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2034191 00:06:40.629 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:40.629 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:40.629 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:40.629 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:40.629 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:40.629 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:40.629 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:40.629 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:40.629 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:40.629 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2034191 00:06:40.629 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:40.629 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2034191 00:06:40.629 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:40.629 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2034191 00:06:40.629 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:40.629 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2034191 00:06:40.629 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:40.629 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2034191 00:06:40.629 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:40.629 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:40.629 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:40.629 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:40.629 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:40.629 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:40.629 element at address: 0x20000020a840 with size: 0.125488 MiB 00:06:40.629 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2034191 00:06:40.629 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:06:40.629 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:40.629 element at address: 0x200027e67480 with size: 0.023743 MiB 00:06:40.629 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:40.629 element at address: 0x200000206580 with size: 0.016113 
MiB 00:06:40.629 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2034191 00:06:40.629 element at address: 0x200027e6d5c0 with size: 0.002441 MiB 00:06:40.629 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:40.629 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:40.629 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:40.629 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:40.629 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:40.629 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:40.629 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:40.629 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:40.629 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:40.629 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:40.629 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:40.629 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:40.629 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:40.629 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:40.629 
associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:40.629 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:40.629 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:40.629 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:40.629 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:40.629 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:40.629 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:40.629 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:06:40.629 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:40.629 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:40.629 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:40.629 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:40.629 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:40.629 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:40.629 associated memzone 
info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:40.629 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:40.629 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:40.629 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:40.629 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:40.629 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:40.629 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:40.629 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:40.629 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:40.629 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:40.629 element at address: 0x20000035d580 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:40.629 element at address: 0x20000035a000 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:40.629 element at address: 0x200000356a80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:40.629 element at address: 0x200000353500 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 
MiB name: 0000:da:01.4_qat 00:06:40.629 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:40.629 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:40.629 element at address: 0x200000349480 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:40.629 element at address: 0x200000345f00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:40.629 element at address: 0x200000342980 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:40.629 element at address: 0x20000033f400 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:40.629 element at address: 0x20000033be80 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:40.629 element at address: 0x200000338900 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:40.629 element at address: 0x200000335380 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:40.629 element at address: 0x200000331e00 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:40.629 element at address: 0x20000032e880 with size: 0.000427 MiB 00:06:40.629 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:40.629 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:40.629 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:40.629 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:06:40.629 associated memzone info: size: 0.000183 MiB name: 
MP_msgpool_2034191 00:06:40.629 element at address: 0x200000206380 with size: 0.000305 MiB 00:06:40.629 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2034191 00:06:40.629 element at address: 0x200027e6e080 with size: 0.000305 MiB 00:06:40.629 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:40.629 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:40.629 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:40.629 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:40.629 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:40.630 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:40.630 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:40.630 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:40.630 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:40.630 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:40.630 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:40.630 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:40.630 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:40.630 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:06:40.630 associated memzone info: 
size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:40.630 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:40.630 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:40.630 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:40.630 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:40.630 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:40.630 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:40.630 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:40.630 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:40.630 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:40.630 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:40.630 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:40.630 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:40.630 element at address: 0x2000003bc280 with size: 0.000244 
MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:40.630 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:40.630 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:40.630 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:40.630 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:40.630 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:40.630 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:40.630 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:40.630 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:40.630 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:40.630 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:40.630 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:40.630 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:06:40.630 
element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:40.630 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:40.630 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:40.630 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:40.630 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:40.630 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:40.630 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:40.630 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:40.630 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:40.630 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:40.630 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:40.630 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:40.630 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_32 00:06:40.630 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:40.630 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:40.630 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:40.630 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:40.630 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:40.630 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:40.630 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:40.630 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:40.630 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:40.630 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:40.630 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:40.630 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:40.630 element at address: 0x20000038c940 with size: 
0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:40.630 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:40.630 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:40.630 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:40.630 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:40.630 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:40.630 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:40.630 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:40.630 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:40.630 element at address: 0x200000381900 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:40.630 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:40.630 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:40.630 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 
00:06:40.630 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:40.630 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:40.630 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:40.630 element at address: 0x20000037a100 with size: 0.000244 MiB 00:06:40.630 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:40.630 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:40.631 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:40.631 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:40.631 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:40.631 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:40.631 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:40.631 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:40.631 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:40.631 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:06:40.631 associated memzone 
info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:40.631 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:40.631 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:40.631 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:40.631 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:40.631 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:40.631 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:40.631 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:40.631 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:40.631 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:40.631 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:40.631 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:06:40.631 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:40.631 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:40.631 13:25:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2034191 00:06:40.631 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2034191 ']' 00:06:40.631 13:25:19 dpdk_mem_utility -- 
common/autotest_common.sh@952 -- # kill -0 2034191 00:06:40.631 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:06:40.631 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:40.631 13:25:19 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2034191 00:06:40.631 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:40.631 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:40.631 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2034191' 00:06:40.631 killing process with pid 2034191 00:06:40.631 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2034191 00:06:40.631 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2034191 00:06:41.197 00:06:41.197 real 0m1.571s 00:06:41.197 user 0m1.618s 00:06:41.197 sys 0m0.518s 00:06:41.197 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:41.197 13:25:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:41.197 ************************************ 00:06:41.197 END TEST dpdk_mem_utility 00:06:41.197 ************************************ 00:06:41.197 13:25:20 -- common/autotest_common.sh@1142 -- # return 0 00:06:41.197 13:25:20 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:41.197 13:25:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:41.198 13:25:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.198 13:25:20 -- common/autotest_common.sh@10 -- # set +x 00:06:41.198 ************************************ 00:06:41.198 START TEST event 00:06:41.198 ************************************ 00:06:41.198 13:25:20 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:41.198 * 
Looking for test storage... 00:06:41.198 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:41.198 13:25:20 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:41.198 13:25:20 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:41.198 13:25:20 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:41.198 13:25:20 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:41.198 13:25:20 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.198 13:25:20 event -- common/autotest_common.sh@10 -- # set +x 00:06:41.455 ************************************ 00:06:41.455 START TEST event_perf 00:06:41.455 ************************************ 00:06:41.455 13:25:20 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:41.455 Running I/O for 1 seconds...[2024-07-15 13:25:20.651733] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:06:41.455 [2024-07-15 13:25:20.651801] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034432 ] 00:06:41.455 [2024-07-15 13:25:20.780783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:41.712 [2024-07-15 13:25:20.887341] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:41.712 [2024-07-15 13:25:20.887427] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:41.712 [2024-07-15 13:25:20.887508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:41.712 [2024-07-15 13:25:20.887512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.647 Running I/O for 1 seconds... 00:06:42.647 lcore 0: 176465 00:06:42.647 lcore 1: 176465 00:06:42.647 lcore 2: 176464 00:06:42.647 lcore 3: 176466 00:06:42.647 done. 
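The event_perf run above was launched with `-m 0xF` and reports counters for lcore 0 through lcore 3. Those four lcores are simply the set bits of the 0xF coremask; a minimal sketch of that mapping (a hypothetical helper, not part of the SPDK scripts traced in this log):

```shell
# Expand a coremask like the "-m 0xF" passed to event_perf into the list
# of lcores that report counters in the output above. Bit N of the mask
# selects lcore N.
mask=0xF
cores=""
for core in $(seq 0 31); do
  if (( (mask >> core) & 1 )); then
    cores="$cores $core"
  fi
done
echo "lcores:$cores"    # prints: lcores: 0 1 2 3
```

The same convention explains the per-thread masks seen later in the scheduler test: 0x1, 0x2, 0x4, and 0x8 are the single-bit masks pinning one thread to each core of the 0xF set.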
00:06:42.647 00:06:42.647 real 0m1.355s 00:06:42.647 user 0m4.204s 00:06:42.647 sys 0m0.145s 00:06:42.647 13:25:21 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:42.647 13:25:21 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:42.647 ************************************ 00:06:42.647 END TEST event_perf 00:06:42.647 ************************************ 00:06:42.647 13:25:22 event -- common/autotest_common.sh@1142 -- # return 0 00:06:42.647 13:25:22 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:42.647 13:25:22 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:42.647 13:25:22 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.647 13:25:22 event -- common/autotest_common.sh@10 -- # set +x 00:06:42.647 ************************************ 00:06:42.647 START TEST event_reactor 00:06:42.647 ************************************ 00:06:42.647 13:25:22 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:42.904 [2024-07-15 13:25:22.098410] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:06:42.904 [2024-07-15 13:25:22.098474] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034628 ] 00:06:42.904 [2024-07-15 13:25:22.232023] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.162 [2024-07-15 13:25:22.334339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.096 test_start 00:06:44.096 oneshot 00:06:44.096 tick 100 00:06:44.096 tick 100 00:06:44.096 tick 250 00:06:44.096 tick 100 00:06:44.096 tick 100 00:06:44.096 tick 250 00:06:44.096 tick 100 00:06:44.096 tick 500 00:06:44.096 tick 100 00:06:44.096 tick 100 00:06:44.096 tick 250 00:06:44.096 tick 100 00:06:44.096 tick 100 00:06:44.096 test_end 00:06:44.096 00:06:44.096 real 0m1.357s 00:06:44.096 user 0m1.210s 00:06:44.096 sys 0m0.141s 00:06:44.096 13:25:23 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:44.097 13:25:23 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:44.097 ************************************ 00:06:44.097 END TEST event_reactor 00:06:44.097 ************************************ 00:06:44.097 13:25:23 event -- common/autotest_common.sh@1142 -- # return 0 00:06:44.097 13:25:23 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:44.097 13:25:23 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:44.097 13:25:23 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.097 13:25:23 event -- common/autotest_common.sh@10 -- # set +x 00:06:44.097 ************************************ 00:06:44.097 START TEST event_reactor_perf 00:06:44.097 ************************************ 00:06:44.097 13:25:23 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:44.356 [2024-07-15 13:25:23.537463] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:06:44.356 [2024-07-15 13:25:23.537533] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2034832 ] 00:06:44.356 [2024-07-15 13:25:23.666005] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.356 [2024-07-15 13:25:23.765380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.743 test_start 00:06:45.743 test_end 00:06:45.743 Performance: 328709 events per second 00:06:45.743 00:06:45.743 real 0m1.341s 00:06:45.743 user 0m1.199s 00:06:45.743 sys 0m0.136s 00:06:45.743 13:25:24 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:45.743 13:25:24 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:45.743 ************************************ 00:06:45.743 END TEST event_reactor_perf 00:06:45.743 ************************************ 00:06:45.743 13:25:24 event -- common/autotest_common.sh@1142 -- # return 0 00:06:45.743 13:25:24 event -- event/event.sh@49 -- # uname -s 00:06:45.743 13:25:24 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:45.743 13:25:24 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:45.743 13:25:24 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:45.743 13:25:24 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.743 13:25:24 event -- common/autotest_common.sh@10 -- # set +x 00:06:45.743 ************************************ 00:06:45.743 START TEST event_scheduler 00:06:45.743 ************************************ 
00:06:45.743 13:25:24 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:45.743 * Looking for test storage... 00:06:45.743 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:45.743 13:25:25 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:45.743 13:25:25 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2035123 00:06:45.743 13:25:25 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:45.743 13:25:25 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:45.743 13:25:25 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2035123 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2035123 ']' 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.743 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:45.743 13:25:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:45.743 [2024-07-15 13:25:25.095298] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:06:45.743 [2024-07-15 13:25:25.095372] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2035123 ] 00:06:46.003 [2024-07-15 13:25:25.194397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:46.003 [2024-07-15 13:25:25.280802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.003 [2024-07-15 13:25:25.280956] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:46.003 [2024-07-15 13:25:25.280959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:46.003 [2024-07-15 13:25:25.280880] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:46.940 13:25:26 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 [2024-07-15 13:25:26.047790] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:46.940 [2024-07-15 13:25:26.047810] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:06:46.940 [2024-07-15 13:25:26.047822] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:46.940 [2024-07-15 13:25:26.047833] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:46.940 [2024-07-15 13:25:26.047841] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:46.940 13:25:26 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 [2024-07-15 13:25:26.135423] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 ************************************ 00:06:46.940 START TEST scheduler_create_thread 00:06:46.940 ************************************ 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 2 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 3 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 4 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 5 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 6 00:06:46.940 13:25:26 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 7 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 8 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 9 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:46.940 13:25:26 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 10 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:46.940 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:47.507 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:47.507 13:25:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:47.507 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:47.507 13:25:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:48.980 13:25:28 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:48.980 13:25:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:48.980 13:25:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:48.980 13:25:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:48.980 13:25:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:49.917 13:25:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:49.917 00:06:49.917 real 0m3.102s 00:06:49.917 user 0m0.025s 00:06:49.917 sys 0m0.006s 00:06:49.917 13:25:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.917 13:25:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:49.917 ************************************ 00:06:49.917 END TEST scheduler_create_thread 00:06:49.917 ************************************ 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:06:49.917 13:25:29 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:49.917 13:25:29 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2035123 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2035123 ']' 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2035123 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:49.917 13:25:29 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2035123 00:06:50.176 13:25:29 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:50.176 13:25:29 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:50.176 13:25:29 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2035123' 00:06:50.176 killing process with pid 2035123 00:06:50.176 13:25:29 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2035123 00:06:50.176 13:25:29 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2035123 00:06:50.436 [2024-07-15 13:25:29.658791] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:50.696 00:06:50.696 real 0m4.949s 00:06:50.696 user 0m9.775s 00:06:50.696 sys 0m0.486s 00:06:50.696 13:25:29 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.696 13:25:29 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:50.696 ************************************ 00:06:50.696 END TEST event_scheduler 00:06:50.696 ************************************ 00:06:50.696 13:25:29 event -- common/autotest_common.sh@1142 -- # return 0 00:06:50.696 13:25:29 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:50.696 13:25:29 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:50.696 13:25:29 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.696 13:25:29 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.696 13:25:29 event -- common/autotest_common.sh@10 -- # set +x 00:06:50.696 ************************************ 00:06:50.696 START TEST app_repeat 00:06:50.696 ************************************ 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:50.696 13:25:29 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2035804 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2035804' 00:06:50.696 Process app_repeat pid: 2035804 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:50.696 spdk_app_start Round 0 00:06:50.696 13:25:29 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2035804 /var/tmp/spdk-nbd.sock 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2035804 ']' 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:50.696 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:50.696 13:25:29 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:50.696 [2024-07-15 13:25:30.025179] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:06:50.696 [2024-07-15 13:25:30.025245] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2035804 ] 00:06:50.955 [2024-07-15 13:25:30.156579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:50.955 [2024-07-15 13:25:30.257773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.955 [2024-07-15 13:25:30.257778] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.521 13:25:30 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:51.521 13:25:30 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:51.521 13:25:30 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:51.779 Malloc0 00:06:51.779 13:25:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:52.037 Malloc1 00:06:52.037 13:25:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:52.037 13:25:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:52.295 /dev/nbd0 00:06:52.295 13:25:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:52.295 13:25:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:52.295 13:25:31 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:52.295 1+0 records in 00:06:52.295 1+0 records out 00:06:52.295 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240058 s, 17.1 MB/s 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:52.295 13:25:31 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:52.296 13:25:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:52.296 13:25:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:52.296 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:52.296 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:52.296 13:25:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:52.554 /dev/nbd1 00:06:52.554 13:25:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:52.554 13:25:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:52.554 13:25:31 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:52.554 13:25:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:52.554 1+0 records in 00:06:52.554 1+0 records out 00:06:52.554 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247834 s, 16.5 MB/s 00:06:52.812 13:25:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:52.812 13:25:31 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:52.812 13:25:31 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:52.812 13:25:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:52.812 13:25:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:52.812 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:52.812 13:25:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:52.812 13:25:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:52.812 13:25:31 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:52.812 13:25:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:53.071 { 00:06:53.071 "nbd_device": "/dev/nbd0", 00:06:53.071 "bdev_name": "Malloc0" 00:06:53.071 }, 00:06:53.071 { 00:06:53.071 
"nbd_device": "/dev/nbd1", 00:06:53.071 "bdev_name": "Malloc1" 00:06:53.071 } 00:06:53.071 ]' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:53.071 { 00:06:53.071 "nbd_device": "/dev/nbd0", 00:06:53.071 "bdev_name": "Malloc0" 00:06:53.071 }, 00:06:53.071 { 00:06:53.071 "nbd_device": "/dev/nbd1", 00:06:53.071 "bdev_name": "Malloc1" 00:06:53.071 } 00:06:53.071 ]' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:53.071 /dev/nbd1' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:53.071 /dev/nbd1' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:53.071 256+0 records in 00:06:53.071 256+0 
records out 00:06:53.071 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106846 s, 98.1 MB/s 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:53.071 256+0 records in 00:06:53.071 256+0 records out 00:06:53.071 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0255093 s, 41.1 MB/s 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:53.071 256+0 records in 00:06:53.071 256+0 records out 00:06:53.071 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212867 s, 49.3 MB/s 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # 
for i in "${nbd_list[@]}" 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:53.071 13:25:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:53.329 13:25:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:53.588 13:25:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:53.847 13:25:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:53.847 13:25:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:54.106 13:25:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:54.365 [2024-07-15 13:25:33.742996] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:54.624 [2024-07-15 13:25:33.842462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:54.624 [2024-07-15 13:25:33.842466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.624 [2024-07-15 13:25:33.894685] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:54.624 [2024-07-15 13:25:33.894738] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:57.158 13:25:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:57.158 13:25:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:57.158 spdk_app_start Round 1 00:06:57.158 13:25:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2035804 /var/tmp/spdk-nbd.sock 00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2035804 ']' 00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:57.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:57.158 13:25:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:57.417 13:25:36 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:57.417 13:25:36 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:57.417 13:25:36 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:57.677 Malloc0 00:06:57.677 13:25:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:57.936 Malloc1 00:06:57.936 13:25:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:57.936 13:25:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:58.195 /dev/nbd0 00:06:58.195 13:25:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:58.195 13:25:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:58.195 1+0 records in 00:06:58.195 1+0 records out 00:06:58.195 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000210658 s, 19.4 MB/s 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:58.195 13:25:37 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:58.195 13:25:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:58.195 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:58.195 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:58.195 13:25:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:58.454 /dev/nbd1 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:58.454 1+0 records in 00:06:58.454 1+0 records out 00:06:58.454 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023201 s, 17.7 MB/s 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:58.454 13:25:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.454 13:25:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:58.713 13:25:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:58.713 { 00:06:58.713 "nbd_device": "/dev/nbd0", 00:06:58.713 "bdev_name": "Malloc0" 00:06:58.713 }, 00:06:58.713 { 00:06:58.713 "nbd_device": "/dev/nbd1", 00:06:58.713 "bdev_name": "Malloc1" 00:06:58.713 } 00:06:58.713 ]' 00:06:58.713 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:58.713 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:58.713 { 00:06:58.713 "nbd_device": "/dev/nbd0", 00:06:58.713 "bdev_name": "Malloc0" 00:06:58.713 }, 00:06:58.713 { 00:06:58.713 "nbd_device": "/dev/nbd1", 00:06:58.713 "bdev_name": "Malloc1" 00:06:58.713 } 00:06:58.713 ]' 00:06:58.713 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:58.713 /dev/nbd1' 00:06:58.713 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:58.713 13:25:38 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:58.713 /dev/nbd1' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:58.973 256+0 records in 00:06:58.973 256+0 records out 00:06:58.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104286 s, 101 MB/s 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:58.973 256+0 records in 00:06:58.973 256+0 records out 00:06:58.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0187983 s, 55.8 MB/s 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:58.973 256+0 records in 00:06:58.973 256+0 records out 00:06:58.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021754 s, 48.2 MB/s 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
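The nbd_dd_data_verify steps above (fill a temp file from /dev/urandom, dd it onto each NBD device, then cmp the device contents back against the source file) can be sketched as a self-contained script. This is a hedged sketch, not the harness itself: plain temp files stand in for /dev/nbd0 and /dev/nbd1, and oflag=direct is dropped because regular files do not need O_DIRECT; otherwise it follows the nbd_common.sh@70-85 flow seen in the log.

```shell
#!/usr/bin/env bash
# Sketch of nbd_dd_data_verify: write random data to every "device" in
# nbd_list, then byte-verify each one against the same source file.
set -e
tmp_file=$(mktemp)                  # plays the role of .../nbdrandtest
nbd_list=("$(mktemp)" "$(mktemp)")  # stand-ins for /dev/nbd0 /dev/nbd1

# write phase: 256 blocks of 4 KiB random data, copied onto each device
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null
for i in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$i" bs=4096 count=256 2>/dev/null
done

# verify phase: compare the first 1M of each device with the source file
for i in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$i"
done
echo "verify OK"
rm -f "$tmp_file" "${nbd_list[@]}"
```

In the real test the verify phase fails loudly (cmp exits non-zero under set -e) if any device returned different bytes than were written.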
00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:58.973 13:25:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:59.232 13:25:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:59.491 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:59.750 13:25:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:59.750 13:25:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:00.011 13:25:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:00.271 [2024-07-15 13:25:39.459466] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:00.271 [2024-07-15 13:25:39.558997] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.271 [2024-07-15 13:25:39.559001] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:00.271 [2024-07-15 13:25:39.612328] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:00.271 [2024-07-15 13:25:39.612382] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:02.803 13:25:42 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:02.803 13:25:42 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:02.804 spdk_app_start Round 2 00:07:02.804 13:25:42 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2035804 /var/tmp/spdk-nbd.sock 00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2035804 ']' 00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:02.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
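waitforlisten above blocks until the target process is up and its RPC socket accepts requests, retrying up to max_retries=100 times. A hedged sketch of that retry pattern follows; the real helper in autotest_common.sh probes the socket with rpc.py, which a simple path-existence check stands in for here.

```shell
# Minimal sketch of the waitforlisten retry loop, assuming the condition
# "RPC server is ready" can be reduced to "the socket path exists".
waitforcondition() {
    local rpc_addr=$1 max_retries=${2:-100} i
    for ((i = 1; i <= max_retries; i++)); do
        [ -e "$rpc_addr" ] && return 0   # ready: stop polling
        sleep 0.1                        # brief back-off between probes
    done
    return 1                             # exhausted retries, still not up
}

sock=$(mktemp)                 # stand-in for /var/tmp/spdk-nbd.sock
waitforcondition "$sock" 10 && echo "listening on $sock"
rm -f "$sock"
```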
00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:02.804 13:25:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:03.062 13:25:42 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:03.062 13:25:42 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:03.062 13:25:42 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:03.322 Malloc0 00:07:03.322 13:25:42 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:03.581 Malloc1 00:07:03.581 13:25:42 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.581 13:25:42 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:03.839 /dev/nbd0 00:07:03.839 13:25:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:03.839 13:25:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:03.839 1+0 records in 00:07:03.839 1+0 records out 00:07:03.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209395 s, 19.6 MB/s 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:03.839 13:25:43 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:03.839 13:25:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:03.839 13:25:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:03.839 13:25:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:03.839 13:25:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:04.096 /dev/nbd1 00:07:04.096 13:25:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:04.096 13:25:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:04.096 13:25:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:04.096 13:25:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:04.096 13:25:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:04.096 13:25:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:04.096 13:25:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:04.353 1+0 records in 00:07:04.353 1+0 records out 00:07:04.353 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000215322 s, 19.0 MB/s 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:04.353 13:25:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:04.353 { 00:07:04.353 "nbd_device": "/dev/nbd0", 00:07:04.353 "bdev_name": "Malloc0" 00:07:04.353 }, 00:07:04.353 { 00:07:04.353 "nbd_device": "/dev/nbd1", 00:07:04.353 "bdev_name": "Malloc1" 00:07:04.353 } 00:07:04.353 ]' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:04.353 { 00:07:04.353 "nbd_device": "/dev/nbd0", 00:07:04.353 "bdev_name": "Malloc0" 00:07:04.353 }, 00:07:04.353 { 00:07:04.353 "nbd_device": "/dev/nbd1", 00:07:04.353 "bdev_name": "Malloc1" 00:07:04.353 } 00:07:04.353 ]' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:04.353 /dev/nbd1' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:04.353 /dev/nbd1' 00:07:04.353 
13:25:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:04.353 13:25:43 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:04.611 256+0 records in 00:07:04.611 256+0 records out 00:07:04.611 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109563 s, 95.7 MB/s 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:04.611 256+0 records in 00:07:04.611 256+0 records out 00:07:04.611 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.017877 s, 58.7 MB/s 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:04.611 256+0 records in 00:07:04.611 256+0 records out 00:07:04.611 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020706 s, 50.6 MB/s 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:04.611 13:25:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
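The waitfornbd helper exercised above (autotest_common.sh@866-887) polls /proc/partitions until the nbd name appears, then confirms the device serves I/O with a one-block O_DIRECT dd read. A hedged sketch of the polling half; a fixture file stands in for the live /proc/partitions so it runs anywhere, and the confirming dd read is omitted.

```shell
# Sketch of the waitfornbd wait loop: retry up to 20 times, breaking as
# soon as the device name shows up as a whole word in the partition list.
waitfornbd_sketch() {
    local nbd_name=$1 partitions=$2 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" "$partitions" && break   # device registered?
        sleep 0.1
    done
    (( i <= 20 ))   # fails if the name never appeared within 20 tries
}

parts=$(mktemp)   # fixture standing in for /proc/partitions
printf '%s\n' 'major minor  #blocks  name' '  43        0     102400 nbd0' > "$parts"
waitfornbd_sketch nbd0 "$parts" && echo "nbd0 present"
rm -f "$parts"
```

grep -w matters here: it matches nbd0 only as a whole word, so a partition such as nbd0p1 does not satisfy the wait for the parent device.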
00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:04.612 13:25:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:04.901 13:25:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:05.185 13:25:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:05.442 13:25:44 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:05.442 13:25:44 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:05.700 13:25:44 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:05.962 [2024-07-15 13:25:45.145976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:05.962 [2024-07-15 13:25:45.244148] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:05.962 [2024-07-15 13:25:45.244153] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:05.962 [2024-07-15 13:25:45.296434] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:05.962 [2024-07-15 13:25:45.296489] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:08.494 13:25:47 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2035804 /var/tmp/spdk-nbd.sock 00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2035804 ']' 00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:08.494 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:08.494 13:25:47 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:08.753 13:25:48 event.app_repeat -- event/event.sh@39 -- # killprocess 2035804 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2035804 ']' 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2035804 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:08.753 13:25:48 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2035804 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2035804' 00:07:09.012 killing process with pid 2035804 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2035804 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2035804 00:07:09.012 spdk_app_start is called in Round 0. 00:07:09.012 Shutdown signal received, stop current app iteration 00:07:09.012 Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 reinitialization... 00:07:09.012 spdk_app_start is called in Round 1. 00:07:09.012 Shutdown signal received, stop current app iteration 00:07:09.012 Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 reinitialization... 00:07:09.012 spdk_app_start is called in Round 2. 
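The killprocess helper above (autotest_common.sh@948-972) refuses to signal a pid blindly: it first confirms the pid is alive, then resolves its command name with ps, guarding against a recycled pid belonging to some other process. A hedged sketch; the real helper additionally treats sudo-wrapped processes specially, which this sketch skips by killing directly.

```shell
# Sketch of killprocess: check liveness (kill -0), resolve the command
# name, then send SIGTERM and reap the child.
killprocess_sketch() {
    local pid=$1
    kill -0 "$pid" 2>/dev/null || return 1            # pid not running
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")   # e.g. "reactor_0"
    echo "killing process with pid $pid ($process_name)"
    kill "$pid"
    wait "$pid" 2>/dev/null || true                   # reap, ignore status
}

sleep 60 &                 # throwaway background process to demonstrate on
killprocess_sketch $!
```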
00:07:09.012 Shutdown signal received, stop current app iteration 00:07:09.012 Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 reinitialization... 00:07:09.012 spdk_app_start is called in Round 3. 00:07:09.012 Shutdown signal received, stop current app iteration 00:07:09.012 13:25:48 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:09.012 13:25:48 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:09.012 00:07:09.012 real 0m18.445s 00:07:09.012 user 0m39.734s 00:07:09.012 sys 0m3.786s 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.012 13:25:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:09.012 ************************************ 00:07:09.012 END TEST app_repeat 00:07:09.012 ************************************ 00:07:09.271 13:25:48 event -- common/autotest_common.sh@1142 -- # return 0 00:07:09.271 13:25:48 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:09.271 00:07:09.271 real 0m27.982s 00:07:09.271 user 0m56.325s 00:07:09.271 sys 0m5.068s 00:07:09.271 13:25:48 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.271 13:25:48 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.271 ************************************ 00:07:09.271 END TEST event 00:07:09.271 ************************************ 00:07:09.271 13:25:48 -- common/autotest_common.sh@1142 -- # return 0 00:07:09.271 13:25:48 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:09.271 13:25:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:09.271 13:25:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.271 13:25:48 -- common/autotest_common.sh@10 -- # set +x 00:07:09.271 ************************************ 00:07:09.271 START TEST thread 00:07:09.271 ************************************ 00:07:09.271 13:25:48 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:09.271 * Looking for test storage... 00:07:09.271 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:09.271 13:25:48 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:09.271 13:25:48 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:09.271 13:25:48 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.271 13:25:48 thread -- common/autotest_common.sh@10 -- # set +x 00:07:09.271 ************************************ 00:07:09.271 START TEST thread_poller_perf 00:07:09.271 ************************************ 00:07:09.271 13:25:48 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:09.530 [2024-07-15 13:25:48.720990] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:09.530 [2024-07-15 13:25:48.721061] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038496 ] 00:07:09.530 [2024-07-15 13:25:48.850044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.530 [2024-07-15 13:25:48.947843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.530 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:10.904 ======================================
00:07:10.904 busy:2309574506 (cyc)
00:07:10.904 total_run_count: 266000
00:07:10.904 tsc_hz: 2300000000 (cyc)
00:07:10.904 ======================================
00:07:10.904 poller_cost: 8682 (cyc), 3774 (nsec)
00:07:10.904
00:07:10.904 real 0m1.356s
00:07:10.904 user 0m1.217s
00:07:10.904 sys 0m0.133s
00:07:10.904 13:25:50 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:10.904 13:25:50 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:10.904 ************************************
00:07:10.904 END TEST thread_poller_perf
00:07:10.904 ************************************
00:07:10.904 13:25:50 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:10.904 13:25:50 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:10.904 13:25:50 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']'
00:07:10.904 13:25:50 thread -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:10.904 13:25:50 thread -- common/autotest_common.sh@10 -- # set +x
00:07:10.904 ************************************
00:07:10.904 START TEST thread_poller_perf
00:07:10.904 ************************************
00:07:10.904 13:25:50 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:10.904 [2024-07-15 13:25:50.158637] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
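The poller_cost figures in the report above follow directly from the other counters: busy cycles divided by total_run_count, then converted to nanoseconds via tsc_hz. A minimal sketch reproducing the first run's arithmetic, with the values copied from the log (integer math, since the report prints truncated values):

```shell
# Reproduce poller_cost for the 1us-period run from the counters above.
busy=2309574506        # busy cycles over the whole run
total_run_count=266000 # number of poller iterations
tsc_hz=2300000000      # timestamp-counter frequency (cycles/second)

cost_cyc=$((busy / total_run_count))
cost_nsec=$((cost_cyc * 1000000000 / tsc_hz))
echo "poller_cost: ${cost_cyc} (cyc), ${cost_nsec} (nsec)"
# prints "poller_cost: 8682 (cyc), 3774 (nsec)", matching the report
```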
00:07:10.904 [2024-07-15 13:25:50.158695] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038700 ]
00:07:10.904 [2024-07-15 13:25:50.271872] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:11.163 [2024-07-15 13:25:50.374426] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:11.163 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:07:12.099 ======================================
00:07:12.099 busy:2302801728 (cyc)
00:07:12.099 total_run_count: 3494000
00:07:12.099 tsc_hz: 2300000000 (cyc)
00:07:12.099 ======================================
00:07:12.099 poller_cost: 659 (cyc), 286 (nsec)
00:07:12.099
00:07:12.099 real 0m1.334s
00:07:12.099 user 0m1.203s
00:07:12.099 sys 0m0.124s
00:07:12.099 13:25:51 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:12.099 13:25:51 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:12.099 ************************************
00:07:12.099 END TEST thread_poller_perf
00:07:12.099 ************************************
00:07:12.099 13:25:51 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:12.099 13:25:51 thread -- thread/thread.sh@17 -- # [[ y != \y ]]
00:07:12.099
00:07:12.099 real 0m2.954s
00:07:12.099 user 0m2.518s
00:07:12.099 sys 0m0.446s
00:07:12.099 13:25:51 thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:12.099 13:25:51 thread -- common/autotest_common.sh@10 -- # set +x
00:07:12.099 ************************************
00:07:12.099 END TEST thread
00:07:12.099 ************************************
00:07:12.358 13:25:51 -- common/autotest_common.sh@1142 -- # return 0
00:07:12.358 13:25:51 -- spdk/autotest.sh@183 -- # run_test accel
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:12.358 13:25:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.358 13:25:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.358 13:25:51 -- common/autotest_common.sh@10 -- # set +x 00:07:12.358 ************************************ 00:07:12.358 START TEST accel 00:07:12.358 ************************************ 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:12.358 * Looking for test storage... 00:07:12.358 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:12.358 13:25:51 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:12.358 13:25:51 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:12.358 13:25:51 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:12.358 13:25:51 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2038932 00:07:12.358 13:25:51 accel -- accel/accel.sh@63 -- # waitforlisten 2038932 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@829 -- # '[' -z 2038932 ']' 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.358 13:25:51 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.358 13:25:51 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:12.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
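The waitforlisten step above blocks until spdk_tgt has created its RPC socket at /var/tmp/spdk.sock. A hypothetical sketch of that polling pattern (the helper name, retry count, and interval here are assumptions for illustration, not SPDK's actual implementation):

```shell
# Poll for a UNIX-domain socket to appear, giving up after max_retries
# attempts; returns 0 once the socket exists, 1 on timeout.
wait_for_socket() {
  local sock=$1 max_retries=${2:-100} i=0
  while ((i++ < max_retries)); do
    [[ -S $sock ]] && return 0   # -S: path exists and is a socket
    sleep 0.1
  done
  return 1
}

# With no target running, the wait times out.
wait_for_socket "/var/tmp/gr-missing-example.sock" 2 || echo "timed out"
# prints "timed out"
```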
00:07:12.358 13:25:51 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.358 13:25:51 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.358 13:25:51 accel -- common/autotest_common.sh@10 -- # set +x 00:07:12.358 13:25:51 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.358 13:25:51 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.358 13:25:51 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.358 13:25:51 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.358 13:25:51 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:12.358 13:25:51 accel -- accel/accel.sh@41 -- # jq -r . 00:07:12.358 [2024-07-15 13:25:51.753958] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:12.358 [2024-07-15 13:25:51.754037] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2038932 ] 00:07:12.618 [2024-07-15 13:25:51.885800] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.618 [2024-07-15 13:25:51.992905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.554 13:25:52 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:13.554 13:25:52 accel -- common/autotest_common.sh@862 -- # return 0 00:07:13.554 13:25:52 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:13.554 13:25:52 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:13.554 13:25:52 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:13.554 13:25:52 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:13.554 13:25:52 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:13.554 13:25:52 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:13.554 13:25:52 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.554 13:25:52 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:13.554 13:25:52 accel -- common/autotest_common.sh@10 -- # set +x 00:07:13.554 13:25:52 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.554 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # 
IFS== 00:07:13.554 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.554 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.555 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.555 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.555 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.555 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.555 13:25:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # IFS== 00:07:13.555 13:25:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:13.555 13:25:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:13.555 13:25:52 accel -- accel/accel.sh@75 -- # killprocess 2038932 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@948 -- # '[' -z 2038932 ']' 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@952 -- # kill -0 2038932 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@953 -- # uname 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2038932 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2038932' 00:07:13.555 killing process with pid 2038932 00:07:13.555 13:25:52 accel -- common/autotest_common.sh@967 -- # kill 2038932 00:07:13.555 
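The repetitive loop above splits each `opc=module` pair on `=` via a one-shot `IFS==` assignment before `read`. A standalone sketch of that parsing step; the opcode/module values here are hypothetical sample data, since the real list comes from the accel_get_opc_assignments RPC of the running target:

```shell
# Hypothetical sample of the RPC output after the
# jq 'to_entries | map("\(.key)=\(.value)")' transform seen in the log.
exp_opcs=("copy=software" "fill=software" "crc32c=software")

declare -A expected_opcs
for opc_opt in "${exp_opcs[@]}"; do
  # IFS== sets IFS to '=' for this read only, splitting "opc=module".
  IFS== read -r opc module <<< "$opc_opt"
  expected_opcs["$opc"]=$module
done

echo "${expected_opcs[crc32c]}"
# prints "software"
```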
13:25:52 accel -- common/autotest_common.sh@972 -- # wait 2038932 00:07:13.813 13:25:53 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:13.813 13:25:53 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:13.813 13:25:53 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:13.813 13:25:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.813 13:25:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:13.813 13:25:53 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:13.813 13:25:53 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:07:13.814 13:25:53 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:13.814 13:25:53 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:14.071 13:25:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:14.071 13:25:53 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:14.071 13:25:53 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:14.071 13:25:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.071 13:25:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.071 ************************************ 00:07:14.071 START TEST accel_missing_filename 00:07:14.071 ************************************ 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.071 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:14.071 13:25:53 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:14.071 13:25:53 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:14.071 [2024-07-15 13:25:53.351583] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:14.071 [2024-07-15 13:25:53.351644] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039207 ] 00:07:14.071 [2024-07-15 13:25:53.479797] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.331 [2024-07-15 13:25:53.582187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.331 [2024-07-15 13:25:53.655888] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:14.331 [2024-07-15 13:25:53.729816] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:14.590 A filename is required. 
00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:14.590 00:07:14.590 real 0m0.511s 00:07:14.590 user 0m0.345s 00:07:14.590 sys 0m0.189s 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.590 13:25:53 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:14.590 ************************************ 00:07:14.590 END TEST accel_missing_filename 00:07:14.590 ************************************ 00:07:14.590 13:25:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:14.590 13:25:53 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:14.590 13:25:53 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:14.590 13:25:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.590 13:25:53 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.590 ************************************ 00:07:14.590 START TEST accel_compress_verify 00:07:14.590 ************************************ 00:07:14.590 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:14.590 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:14.590 13:25:53 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:14.590 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:14.590 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.591 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:14.591 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.591 13:25:53 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:14.591 13:25:53 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:14.591 [2024-07-15 13:25:53.942422] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:14.591 [2024-07-15 13:25:53.942482] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039341 ] 00:07:14.851 [2024-07-15 13:25:54.071501] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.851 [2024-07-15 13:25:54.168967] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.851 [2024-07-15 13:25:54.232843] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:15.110 [2024-07-15 13:25:54.305008] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:07:15.110 00:07:15.110 Compression does not support the verify option, aborting. 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:15.110 00:07:15.110 real 0m0.495s 00:07:15.110 user 0m0.342s 00:07:15.110 sys 0m0.184s 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.110 13:25:54 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:15.110 ************************************ 00:07:15.110 END TEST accel_compress_verify 00:07:15.111 ************************************ 00:07:15.111 13:25:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:15.111 13:25:54 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:07:15.111 13:25:54 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:15.111 13:25:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.111 13:25:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:15.111 ************************************ 00:07:15.111 START TEST accel_wrong_workload 00:07:15.111 ************************************ 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.111 13:25:54 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]]
00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=,
00:07:15.111 13:25:54 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r .
00:07:15.111 Unsupported workload type: foobar
[2024-07-15 13:25:54.504295] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:07:15.111 accel_perf options:
00:07:15.111 [-h help message]
00:07:15.111 [-q queue depth per core]
00:07:15.111 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:07:15.111 [-T number of threads per core
00:07:15.111 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:07:15.111 [-t time in seconds]
00:07:15.111 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:07:15.111 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:07:15.111 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:07:15.111 [-l for compress/decompress workloads, name of uncompressed input file
00:07:15.111 [-S for crc32c workload, use this seed value (default 0)
00:07:15.111 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:07:15.111 [-f for fill workload, use this BYTE value (default 255)
00:07:15.111 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:07:15.111 [-y verify result if this switch is on]
00:07:15.111 [-a tasks to allocate per core (default: same value as -q)]
00:07:15.111 Can be used to spread operations across a wider range of memory.
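The NOT wrapper's status handling shows up in the traces earlier in this section: an exit status of 234 collapses to 106 and then to 1, and 161 likewise collapses to 33 and then 1. A hypothetical sketch of that normalization, under the assumption (not verified against autotest_common.sh) that it strips the 128 signal offset and then treats any remaining nonzero status uniformly so NOT can negate it:

```shell
# Hypothetical normalization: subtract 128 from signal-style statuses,
# then collapse any remaining nonzero status to 1.
normalize_es() {
  local es=$1
  ((es > 128)) && ((es -= 128))
  ((es != 0)) && es=1
  echo "$es"
}

normalize_es 234   # prints "1" (234 -> 106 -> 1, as in the trace)
normalize_es 161   # prints "1" (161 -> 33 -> 1)
normalize_es 0     # prints "0"
```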
00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:15.111 00:07:15.111 real 0m0.044s 00:07:15.111 user 0m0.029s 00:07:15.111 sys 0m0.015s 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.111 13:25:54 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:15.111 ************************************ 00:07:15.111 END TEST accel_wrong_workload 00:07:15.111 ************************************ 00:07:15.111 Error: writing output failed: Broken pipe 00:07:15.370 13:25:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:15.370 13:25:54 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:15.370 13:25:54 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:15.370 13:25:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.370 13:25:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:15.370 ************************************ 00:07:15.370 START TEST accel_negative_buffers 00:07:15.370 ************************************ 00:07:15.370 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:15.370 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:15.370 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:15.370 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:15.370 13:25:54 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf
00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=,
00:07:15.371 13:25:54 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r .
00:07:15.371 -x option must be non-negative.
[2024-07-15 13:25:54.631078] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:07:15.371 accel_perf options:
00:07:15.371 [-h help message]
00:07:15.371 [-q queue depth per core]
00:07:15.371 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:07:15.371 [-T number of threads per core
00:07:15.371 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:07:15.371 [-t time in seconds]
00:07:15.371 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:07:15.371 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:07:15.371 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:07:15.371 [-l for compress/decompress workloads, name of uncompressed input file
00:07:15.371 [-S for crc32c workload, use this seed value (default 0)
00:07:15.371 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:07:15.371 [-f for fill workload, use this BYTE value (default 255)
00:07:15.371 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:07:15.371 [-y verify result if this switch is on]
00:07:15.371 [-a tasks to allocate per core (default: same value as -q)]
00:07:15.371 Can be used to spread operations across a wider range of memory.
00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:15.371 00:07:15.371 real 0m0.043s 00:07:15.371 user 0m0.023s 00:07:15.371 sys 0m0.020s 00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.371 13:25:54 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:15.371 ************************************ 00:07:15.371 END TEST accel_negative_buffers 00:07:15.371 ************************************ 00:07:15.371 Error: writing output failed: Broken pipe 00:07:15.371 13:25:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:15.371 13:25:54 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:15.371 13:25:54 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:15.371 13:25:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.371 13:25:54 accel -- common/autotest_common.sh@10 -- # set +x 00:07:15.371 ************************************ 00:07:15.371 START TEST accel_crc32c 00:07:15.371 ************************************ 00:07:15.371 13:25:54 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 
00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:15.371 13:25:54 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:15.371 [2024-07-15 13:25:54.749289] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:15.371 [2024-07-15 13:25:54.749357] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039475 ] 00:07:15.630 [2024-07-15 13:25:54.879639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.630 [2024-07-15 13:25:54.980982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.630 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:15.890 13:25:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:16.827 13:25:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.827 00:07:16.827 real 0m1.508s 00:07:16.827 user 0m1.315s 00:07:16.827 sys 0m0.199s 00:07:16.827 13:25:56 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.827 13:25:56 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:16.827 ************************************ 00:07:16.827 END TEST accel_crc32c 00:07:16.827 ************************************ 00:07:17.087 13:25:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:17.087 13:25:56 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:17.087 13:25:56 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:17.087 13:25:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.087 13:25:56 accel -- common/autotest_common.sh@10 -- # set +x 00:07:17.087 ************************************ 
00:07:17.087 START TEST accel_crc32c_C2 00:07:17.087 ************************************ 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:17.087 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:17.087 [2024-07-15 13:25:56.338033] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:17.087 [2024-07-15 13:25:56.338098] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039763 ] 00:07:17.087 [2024-07-15 13:25:56.469965] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.346 [2024-07-15 13:25:56.577001] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:17.346 
13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:17.346 13:25:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.750 00:07:18.750 real 0m1.513s 00:07:18.750 user 0m1.323s 00:07:18.750 sys 0m0.197s 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.750 13:25:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:18.750 ************************************ 00:07:18.750 END TEST accel_crc32c_C2 00:07:18.750 ************************************ 00:07:18.750 13:25:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:18.750 13:25:57 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:18.750 13:25:57 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:18.750 13:25:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.750 13:25:57 accel -- common/autotest_common.sh@10 -- # set +x 00:07:18.750 ************************************ 00:07:18.750 START TEST accel_copy 00:07:18.750 ************************************ 00:07:18.750 13:25:57 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:18.750 13:25:57 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:18.750 [2024-07-15 13:25:57.933520] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:18.750 [2024-07-15 13:25:57.933581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2039958 ] 00:07:18.750 [2024-07-15 13:25:58.058654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.750 [2024-07-15 13:25:58.157138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.008 13:25:58 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.008 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:19.009 13:25:58 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:19.009 13:25:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:20.383 13:25:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.383 00:07:20.383 real 0m1.487s 00:07:20.383 user 0m1.304s 00:07:20.383 sys 0m0.187s 00:07:20.383 13:25:59 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.383 13:25:59 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:20.383 ************************************ 00:07:20.383 END TEST accel_copy 00:07:20.383 ************************************ 00:07:20.383 13:25:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:20.383 13:25:59 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:20.383 13:25:59 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:20.383 13:25:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.383 13:25:59 accel -- common/autotest_common.sh@10 -- # set +x 00:07:20.383 ************************************ 00:07:20.383 START TEST accel_fill 00:07:20.383 ************************************ 00:07:20.383 13:25:59 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:20.383 [2024-07-15 13:25:59.499763] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:20.383 [2024-07-15 13:25:59.499824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2040161 ] 00:07:20.383 [2024-07-15 13:25:59.628238] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.383 [2024-07-15 13:25:59.729189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.383 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.384 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:20.642 13:25:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:21.631 13:26:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.631 00:07:21.631 real 0m1.509s 00:07:21.631 user 0m1.319s 00:07:21.631 sys 0m0.194s 00:07:21.631 13:26:00 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.631 13:26:00 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:21.631 ************************************ 00:07:21.631 END TEST accel_fill 00:07:21.631 ************************************ 00:07:21.631 13:26:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:21.631 13:26:01 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:21.631 13:26:01 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:21.631 13:26:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.631 13:26:01 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.631 ************************************ 00:07:21.631 START TEST accel_copy_crc32c 00:07:21.631 ************************************ 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:21.631 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:21.891 [2024-07-15 13:26:01.085659] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:21.891 [2024-07-15 13:26:01.085724] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2040359 ] 00:07:21.891 [2024-07-15 13:26:01.215329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.150 [2024-07-15 13:26:01.320849] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.150 13:26:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.526 00:07:23.526 real 0m1.526s 00:07:23.526 user 0m1.326s 00:07:23.526 sys 0m0.197s 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.526 13:26:02 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:23.526 ************************************ 00:07:23.526 END TEST accel_copy_crc32c 00:07:23.526 ************************************ 00:07:23.526 13:26:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:23.526 13:26:02 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:23.526 13:26:02 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:23.526 13:26:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.526 13:26:02 accel -- common/autotest_common.sh@10 -- # set +x 00:07:23.526 ************************************ 00:07:23.526 START TEST accel_copy_crc32c_C2 00:07:23.526 
************************************ 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:23.526 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:23.526 [2024-07-15 13:26:02.695119] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:23.526 [2024-07-15 13:26:02.695181] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2040555 ] 00:07:23.526 [2024-07-15 13:26:02.825094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.526 [2024-07-15 13:26:02.926979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:02 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:23.785 13:26:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 
13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.162 00:07:25.162 real 0m1.512s 00:07:25.162 user 0m1.321s 00:07:25.162 sys 0m0.196s 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:25.162 13:26:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:25.162 ************************************ 00:07:25.162 END TEST accel_copy_crc32c_C2 00:07:25.162 ************************************ 00:07:25.162 13:26:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:25.162 13:26:04 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:25.162 13:26:04 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:07:25.162 13:26:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.162 13:26:04 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.162 ************************************ 00:07:25.162 START TEST accel_dualcast 00:07:25.162 ************************************ 00:07:25.162 13:26:04 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:25.162 [2024-07-15 13:26:04.290768] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:25.162 [2024-07-15 13:26:04.290832] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2040846 ] 00:07:25.162 [2024-07-15 13:26:04.418715] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.162 [2024-07-15 13:26:04.517160] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.162 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.420 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:25.421 13:26:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:26.355 13:26:05 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.355 00:07:26.355 real 0m1.492s 00:07:26.355 user 0m1.300s 00:07:26.355 sys 0m0.189s 00:07:26.355 13:26:05 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.355 13:26:05 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:26.355 ************************************ 00:07:26.355 END TEST accel_dualcast 00:07:26.355 ************************************ 00:07:26.613 13:26:05 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:26.613 13:26:05 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:26.613 13:26:05 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:26.613 13:26:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.613 13:26:05 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.613 ************************************ 00:07:26.613 START TEST accel_compare 00:07:26.613 ************************************ 00:07:26.613 13:26:05 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.613 
13:26:05 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:26.613 13:26:05 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:26.613 [2024-07-15 13:26:05.864249] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:26.613 [2024-07-15 13:26:05.864315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2041108 ] 00:07:26.613 [2024-07-15 13:26:05.993966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.871 [2024-07-15 13:26:06.095603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 
13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:26.871 13:26:06 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:26.871 13:26:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:28.245 13:26:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.245 00:07:28.245 real 0m1.502s 00:07:28.245 user 0m1.311s 00:07:28.245 sys 0m0.195s 00:07:28.245 13:26:07 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:28.245 13:26:07 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:28.245 ************************************ 00:07:28.245 END TEST accel_compare 00:07:28.245 ************************************ 00:07:28.245 13:26:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:28.245 13:26:07 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:28.245 13:26:07 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:28.245 13:26:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.245 13:26:07 accel -- common/autotest_common.sh@10 -- # set +x 00:07:28.245 ************************************ 00:07:28.245 START TEST accel_xor 00:07:28.245 ************************************ 00:07:28.245 13:26:07 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:28.245 13:26:07 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:28.245 [2024-07-15 13:26:07.448373] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:28.245 [2024-07-15 13:26:07.448435] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2041308 ] 00:07:28.245 [2024-07-15 13:26:07.577849] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.504 [2024-07-15 13:26:07.679480] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.504 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:28.505 13:26:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:29.882 13:26:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.882 00:07:29.882 real 0m1.507s 00:07:29.882 user 0m1.320s 00:07:29.882 sys 0m0.194s 00:07:29.882 13:26:08 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.882 13:26:08 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:29.882 ************************************ 00:07:29.882 END TEST accel_xor 00:07:29.882 ************************************ 00:07:29.882 13:26:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:29.882 13:26:08 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:29.882 13:26:08 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:29.882 13:26:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.882 13:26:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:29.882 ************************************ 00:07:29.882 START TEST accel_xor 00:07:29.882 ************************************ 00:07:29.882 13:26:09 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:29.882 13:26:09 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:29.882 [2024-07-15 13:26:09.034025] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
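The repeated `IFS=:` / `read -r var val` / `case "$var"` trace lines throughout this log come from the loop in `accel.sh` that parses the colon-delimited configuration dump emitted by `accel_perf`. A minimal sketch of that parsing pattern is below; the key names (`module`, `opcode`) and the function name are illustrative assumptions, not SPDK's actual implementation.

```shell
#!/usr/bin/env bash
# Sketch: parse "key: value" lines with IFS=: and read -r var val,
# as the traced loop in accel.sh does. Key names are hypothetical.
accel_module="" accel_opc=""
parse_config() {
  while IFS=: read -r var val; do
    case "$var" in
      module) accel_module="${val# }" ;;  # strip the space left after ':'
      opcode) accel_opc="${val# }" ;;
      *) ;;                               # ignore keys we don't track
    esac
  done
}
# Redirection (not a pipe) keeps the loop in the current shell,
# so the assignments survive the loop.
parse_config <<'EOF'
module: software
opcode: xor
EOF
echo "$accel_module $accel_opc"   # → software xor
```

Because the loop reads from a redirection rather than a pipeline, it runs in the current shell and the variables set inside it remain visible afterwards — which is why the log can later assert `[[ -n software ]]` and `[[ -n xor ]]` on the parsed values.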
00:07:29.882 [2024-07-15 13:26:09.034090] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2041501 ] 00:07:29.882 [2024-07-15 13:26:09.162753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.882 [2024-07-15 13:26:09.264846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 
13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:30.142 13:26:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:31.520 13:26:10 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.520 00:07:31.520 real 0m1.514s 00:07:31.520 user 0m1.316s 00:07:31.520 sys 0m0.197s 00:07:31.520 13:26:10 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:31.520 13:26:10 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:31.520 ************************************ 00:07:31.520 END TEST accel_xor 00:07:31.520 ************************************ 00:07:31.520 13:26:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:31.520 13:26:10 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:31.520 13:26:10 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:31.520 13:26:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.520 13:26:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.520 ************************************ 00:07:31.520 START TEST accel_dif_verify 00:07:31.520 ************************************ 00:07:31.520 13:26:10 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
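The `START TEST` / `END TEST` asterisk banners and the `real`/`user`/`sys` timing lines in this log are produced by the `run_test` helper from `autotest_common.sh`. A simplified sketch of such a wrapper follows; this is an assumed reconstruction of the pattern visible in the log, not the actual SPDK helper.

```shell
#!/usr/bin/env bash
# Sketch of a run_test-style wrapper: banner the test name, time the
# test body, and propagate its exit status. Simplified reconstruction.
run_test() {
  local name=$1; shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  time "$@"          # timing goes to stderr as real/user/sys lines
  local rc=$?        # $? is expanded before 'local' runs, so rc is the test's status
  echo "************************************"
  echo "END TEST $name"
  echo "************************************"
  return $rc
}

run_test banner_demo true
```

In the log above, each accel test (`accel_xor`, `accel_dif_verify`, `accel_dif_generate`, ...) is launched this way, e.g. `run_test accel_dif_verify accel_test -t 1 -w dif_verify`, which is why every test is bracketed by matching banners and a timing triple.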
00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:31.520 [2024-07-15 13:26:10.615465] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:31.520 [2024-07-15 13:26:10.615525] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2041701 ] 00:07:31.520 [2024-07-15 13:26:10.745734] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.520 [2024-07-15 13:26:10.849041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:31.520 13:26:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:32.898 13:26:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:32.898 00:07:32.898 real 0m1.513s 00:07:32.898 user 0m1.323s 00:07:32.898 sys 0m0.192s 00:07:32.898 13:26:12 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.898 13:26:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:32.898 ************************************ 00:07:32.898 END TEST accel_dif_verify 00:07:32.898 
************************************ 00:07:32.898 13:26:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:32.898 13:26:12 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:32.898 13:26:12 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:32.898 13:26:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.898 13:26:12 accel -- common/autotest_common.sh@10 -- # set +x 00:07:32.898 ************************************ 00:07:32.898 START TEST accel_dif_generate 00:07:32.898 ************************************ 00:07:32.898 13:26:12 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:32.898 13:26:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:32.898 13:26:12 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:32.898 [2024-07-15 13:26:12.209072] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:32.898 [2024-07-15 13:26:12.209133] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2041950 ] 00:07:33.161 [2024-07-15 13:26:12.338397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.161 [2024-07-15 13:26:12.438959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:33.161 13:26:12 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 
13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.161 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.162 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:33.162 13:26:12 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:33.162 13:26:12 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:33.162 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:33.162 13:26:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:34.540 13:26:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.540 00:07:34.540 real 0m1.497s 00:07:34.540 user 0m1.310s 00:07:34.540 sys 0m0.186s 00:07:34.540 13:26:13 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.540 13:26:13 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:34.540 ************************************ 00:07:34.540 END TEST 
accel_dif_generate 00:07:34.540 ************************************ 00:07:34.540 13:26:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:34.540 13:26:13 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:34.540 13:26:13 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:34.540 13:26:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.540 13:26:13 accel -- common/autotest_common.sh@10 -- # set +x 00:07:34.540 ************************************ 00:07:34.540 START TEST accel_dif_generate_copy 00:07:34.540 ************************************ 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:34.540 13:26:13 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:07:34.540 [2024-07-15 13:26:13.783273] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:34.540 [2024-07-15 13:26:13.783334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042257 ] 00:07:34.540 [2024-07-15 13:26:13.913082] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.801 [2024-07-15 13:26:14.014428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:34.801 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:34.802 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:34.802 13:26:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.178 00:07:36.178 real 0m1.509s 00:07:36.178 user 0m1.322s 00:07:36.178 sys 0m0.191s 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.178 13:26:15 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:36.178 ************************************ 00:07:36.178 END TEST 
accel_dif_generate_copy 00:07:36.178 ************************************ 00:07:36.178 13:26:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:36.178 13:26:15 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:36.178 13:26:15 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.178 13:26:15 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:36.178 13:26:15 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.178 13:26:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:36.178 ************************************ 00:07:36.178 START TEST accel_comp 00:07:36.178 ************************************ 00:07:36.178 13:26:15 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.178 13:26:15 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:36.178 13:26:15 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:07:36.179 [2024-07-15 13:26:15.367735] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:36.179 [2024-07-15 13:26:15.367793] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042456 ] 00:07:36.179 [2024-07-15 13:26:15.496219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.179 [2024-07-15 13:26:15.597146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:36.438 13:26:15 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:36.439 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:36.439 13:26:15 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:37.818 13:26:16 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:37.818 00:07:37.818 real 0m1.513s 00:07:37.818 user 0m1.319s 00:07:37.818 sys 0m0.195s 00:07:37.818 13:26:16 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:37.818 13:26:16 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:37.818 ************************************ 00:07:37.818 END TEST accel_comp 00:07:37.818 ************************************ 00:07:37.818 13:26:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:37.818 13:26:16 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:37.818 13:26:16 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:37.818 13:26:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.818 13:26:16 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.818 ************************************ 00:07:37.818 START TEST accel_decomp 00:07:37.818 ************************************ 00:07:37.818 13:26:16 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:37.818 13:26:16 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:37.818 13:26:16 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:37.818 [2024-07-15 13:26:16.961636] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:37.818 [2024-07-15 13:26:16.961698] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042652 ] 00:07:37.818 [2024-07-15 13:26:17.090521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.818 [2024-07-15 13:26:17.191448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.077 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.077 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 
13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:38.078 13:26:17 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:38.078 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:39.074 13:26:18 
accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:07:39.074 13:26:17 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:07:39.074 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:07:39.074 13:26:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:39.074 13:26:18 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:39.074
00:07:39.074 real 0m1.511s
00:07:39.074 user 0m1.312s
00:07:39.074 sys 0m0.205s
00:07:39.074 13:26:18 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:39.074 13:26:18 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x
00:07:39.074 ************************************
00:07:39.074 END TEST accel_decomp
00:07:39.074 ************************************
00:07:39.074 13:26:18 accel -- common/autotest_common.sh@1142 -- # return 0
00:07:39.074 13:26:18 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:39.074 13:26:18 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:07:39.074 13:26:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:39.074 13:26:18 accel -- common/autotest_common.sh@10 -- # set +x
00:07:39.333 ************************************
00:07:39.333 START TEST accel_decomp_full
00:07:39.333 ************************************
00:07:39.333 13:26:18 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc
00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module
00:07:39.333
13:26:18 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:39.333 13:26:18 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:39.333 [2024-07-15 13:26:18.526730] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:39.333 [2024-07-15 13:26:18.526774] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2042851 ] 00:07:39.333 [2024-07-15 13:26:18.637954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.333 [2024-07-15 13:26:18.745180] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:39.592 13:26:18 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=:
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:40.971 13:26:19 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:40.971
00:07:40.971 real 0m1.491s
00:07:40.971 user 0m0.011s
00:07:40.971 sys 0m0.003s
00:07:40.971 13:26:19 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:40.971 13:26:19 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x
00:07:40.971 ************************************
00:07:40.971 END TEST accel_decomp_full
00:07:40.971 ************************************
00:07:40.971 13:26:20 accel -- common/autotest_common.sh@1142 -- # return 0
00:07:40.971 13:26:20 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:40.971 13:26:20 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:07:40.971 13:26:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:40.971 13:26:20 accel -- common/autotest_common.sh@10 -- # set +x
00:07:40.971 ************************************
00:07:40.971 START TEST accel_decomp_mcore
00:07:40.971 ************************************
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=,
00:07:40.971 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r .
00:07:40.971 [2024-07-15 13:26:20.115892] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:07:40.971 [2024-07-15 13:26:20.115968] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043051 ] 00:07:40.971 [2024-07-15 13:26:20.246403] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:40.971 [2024-07-15 13:26:20.348207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:40.971 [2024-07-15 13:26:20.348292] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:40.971 [2024-07-15 13:26:20.348370] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:40.971 [2024-07-15 13:26:20.348374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:41.230 13:26:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:42.165
00:07:42.165 real 0m1.502s
00:07:42.165 user 0m4.707s
00:07:42.165 sys 0m0.200s
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:42.165 13:26:21 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x
00:07:42.165 ************************************
00:07:42.165 END TEST accel_decomp_mcore
00:07:42.165 ************************************
00:07:42.424 13:26:21 accel -- common/autotest_common.sh@1142 -- # return 0
00:07:42.424 13:26:21 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:42.424 13:26:21 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:07:42.424 13:26:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:42.424 13:26:21 accel -- common/autotest_common.sh@10 -- # set +x
00:07:42.424 ************************************
00:07:42.424 START TEST accel_decomp_full_mcore
00:07:42.424 ************************************
00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc
00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module
00:07:42.424 13:26:21
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:42.424 13:26:21 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:42.424 [2024-07-15 13:26:21.699228] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:42.424 [2024-07-15 13:26:21.699293] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043335 ] 00:07:42.424 [2024-07-15 13:26:21.831217] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:42.683 [2024-07-15 13:26:21.942399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.683 [2024-07-15 13:26:21.942482] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:42.683 [2024-07-15 13:26:21.942557] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:42.683 [2024-07-15 13:26:21.942561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:42.683 13:26:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:42.683 13:26:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.060 00:07:44.060 real 0m1.545s 00:07:44.060 user 0m4.815s 00:07:44.060 sys 0m0.215s 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.060 13:26:23 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:44.060 ************************************ 00:07:44.060 END TEST accel_decomp_full_mcore 00:07:44.060 ************************************ 00:07:44.060 13:26:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.060 13:26:23 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:44.060 13:26:23 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:44.060 13:26:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.060 13:26:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.060 
************************************ 00:07:44.060 START TEST accel_decomp_mthread 00:07:44.060 ************************************ 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:44.060 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:44.060 [2024-07-15 13:26:23.312912] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:44.060 [2024-07-15 13:26:23.312991] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043608 ] 00:07:44.060 [2024-07-15 13:26:23.440964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.319 [2024-07-15 13:26:23.543727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 
13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:44.319 13:26:23 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.708 00:07:45.708 real 0m1.522s 00:07:45.708 user 0m1.322s 00:07:45.708 sys 0m0.203s 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:45.708 13:26:24 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:07:45.708 ************************************ 00:07:45.708 END TEST accel_decomp_mthread 00:07:45.708 ************************************ 00:07:45.708 13:26:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.708 13:26:24 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:45.708 13:26:24 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:45.708 13:26:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.708 13:26:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:45.708 ************************************ 00:07:45.708 START TEST accel_decomp_full_mthread 00:07:45.708 ************************************ 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:45.708 13:26:24 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:45.708 [2024-07-15 13:26:24.925349] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:45.708 [2024-07-15 13:26:24.925475] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2043805 ] 00:07:45.708 [2024-07-15 13:26:25.121394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.967 [2024-07-15 13:26:25.224626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:07:45.967 13:26:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.345 00:07:47.345 real 0m1.607s 00:07:47.345 user 0m1.364s 00:07:47.345 sys 0m0.246s 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.345 13:26:26 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:47.345 ************************************ 00:07:47.345 END TEST accel_decomp_full_mthread 00:07:47.345 ************************************ 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.345 13:26:26 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:47.345 13:26:26 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:47.345 13:26:26 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:47.345 13:26:26 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:47.345 13:26:26 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2044003 00:07:47.345 13:26:26 accel -- accel/accel.sh@63 -- # waitforlisten 2044003 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@829 -- 
# '[' -z 2044003 ']' 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:47.345 13:26:26 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:47.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:47.345 13:26:26 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:47.345 13:26:26 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.345 13:26:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.345 13:26:26 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.345 13:26:26 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.345 13:26:26 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.345 13:26:26 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:47.345 13:26:26 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:47.345 13:26:26 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:47.345 13:26:26 accel -- accel/accel.sh@41 -- # jq -r . 00:07:47.345 [2024-07-15 13:26:26.591752] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:47.345 [2024-07-15 13:26:26.591820] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044003 ] 00:07:47.345 [2024-07-15 13:26:26.719096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.605 [2024-07-15 13:26:26.817787] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.172 [2024-07-15 13:26:27.586809] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:48.431 13:26:27 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:48.431 13:26:27 accel -- common/autotest_common.sh@862 -- # return 0 00:07:48.431 13:26:27 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:48.431 13:26:27 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:48.431 13:26:27 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:48.431 13:26:27 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:48.431 13:26:27 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:48.431 13:26:27 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:48.431 13:26:27 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.431 13:26:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.431 13:26:27 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:48.431 13:26:27 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.691 "method": "compressdev_scan_accel_module", 00:07:48.691 13:26:27 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:48.691 13:26:27 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.691 13:26:27 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- 
accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:48.691 13:26:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:48.691 13:26:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:48.691 13:26:27 accel -- accel/accel.sh@75 -- # killprocess 2044003 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@948 -- # '[' -z 2044003 ']' 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@952 -- # kill -0 2044003 00:07:48.691 13:26:27 accel -- common/autotest_common.sh@953 -- # uname 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2044003 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2044003' 00:07:48.691 killing process with pid 2044003 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@967 -- # 
kill 2044003 00:07:48.691 13:26:28 accel -- common/autotest_common.sh@972 -- # wait 2044003 00:07:49.260 13:26:28 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:49.260 13:26:28 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.260 13:26:28 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:49.260 13:26:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.260 13:26:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.260 ************************************ 00:07:49.260 START TEST accel_cdev_comp 00:07:49.260 ************************************ 00:07:49.260 13:26:28 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.260 13:26:28 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:49.260 13:26:28 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:49.260 [2024-07-15 13:26:28.520590] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:49.260 [2024-07-15 13:26:28.520653] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044295 ] 00:07:49.260 [2024-07-15 13:26:28.652135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.520 [2024-07-15 13:26:28.757942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.457 [2024-07-15 13:26:29.517880] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:50.457 [2024-07-15 13:26:29.520487] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdfc080 PMD being used: compress_qat 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 [2024-07-15 13:26:29.524606] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe00e60 PMD being used: compress_qat 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.457 13:26:29 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.457 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:50.458 13:26:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:51.395 13:26:30 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:51.395 00:07:51.395 real 0m2.223s 00:07:51.395 user 0m1.630s 00:07:51.395 sys 0m0.596s 00:07:51.395 13:26:30 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:51.395 13:26:30 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:51.395 ************************************ 00:07:51.395 END TEST accel_cdev_comp 00:07:51.395 ************************************ 00:07:51.395 13:26:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:51.395 13:26:30 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:51.395 13:26:30 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:51.395 13:26:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.395 13:26:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.395 ************************************ 00:07:51.395 START TEST accel_cdev_decomp 00:07:51.395 ************************************ 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:51.395 13:26:30 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:51.395 13:26:30 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:51.395 [2024-07-15 13:26:30.819397] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:07:51.395 [2024-07-15 13:26:30.819461] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044574 ] 00:07:51.654 [2024-07-15 13:26:30.950759] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.654 [2024-07-15 13:26:31.053407] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.593 [2024-07-15 13:26:31.818211] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:52.593 [2024-07-15 13:26:31.820844] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d67080 PMD being used: compress_qat 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 [2024-07-15 13:26:31.825059] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d6be60 PMD being used: compress_qat 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:52.593 13:26:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:52.593 13:26:31 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:32 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:54.000 00:07:54.000 real 0m2.209s 00:07:54.000 user 0m1.635s 00:07:54.000 sys 0m0.572s 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.000 13:26:32 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:54.000 ************************************ 00:07:54.000 END TEST accel_cdev_decomp 00:07:54.000 ************************************ 00:07:54.000 13:26:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:54.000 13:26:33 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:54.000 13:26:33 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:54.000 13:26:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.000 13:26:33 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.000 ************************************ 00:07:54.000 START TEST accel_cdev_decomp_full 00:07:54.000 ************************************ 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:54.000 13:26:33 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:54.000 [2024-07-15 13:26:33.098543] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:54.000 [2024-07-15 13:26:33.098607] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2044953 ] 00:07:54.000 [2024-07-15 13:26:33.229078] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.000 [2024-07-15 13:26:33.329889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.933 [2024-07-15 13:26:34.101661] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:54.933 [2024-07-15 13:26:34.104257] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1883080 PMD being used: compress_qat 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.933 [2024-07-15 13:26:34.107540] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1882ce0 PMD being used: compress_qat 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.933 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:54.934 13:26:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:55.927 00:07:55.927 real 0m2.226s 00:07:55.927 user 0m1.649s 00:07:55.927 sys 0m0.579s 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.927 13:26:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:55.927 ************************************ 00:07:55.927 END TEST accel_cdev_decomp_full 00:07:55.927 ************************************ 00:07:55.927 13:26:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.927 13:26:35 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:55.927 13:26:35 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:55.927 13:26:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.927 13:26:35 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.186 ************************************ 00:07:56.186 START TEST accel_cdev_decomp_mcore 00:07:56.186 ************************************ 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:56.186 13:26:35 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:56.186 [2024-07-15 13:26:35.395451] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:56.186 [2024-07-15 13:26:35.395518] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045290 ] 00:07:56.186 [2024-07-15 13:26:35.527108] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:56.445 [2024-07-15 13:26:35.629493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:56.445 [2024-07-15 13:26:35.629580] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:56.445 [2024-07-15 13:26:35.629655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:56.445 [2024-07-15 13:26:35.629659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.013 [2024-07-15 13:26:36.382946] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:57.013 [2024-07-15 13:26:36.385573] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2423720 PMD being used: compress_qat 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 
13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 [2024-07-15 13:26:36.391307] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f042419b8b0 PMD being used: compress_qat 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 [2024-07-15 13:26:36.392087] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f041c19b8b0 PMD being used: compress_qat 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 [2024-07-15 13:26:36.393240] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24289f0 PMD being used: compress_qat 00:07:57.013 [2024-07-15 13:26:36.393425] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f041419b8b0 PMD being used: compress_qat 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 
13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:57.013 13:26:36 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:57.013 13:26:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 
13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:58.387 00:07:58.387 real 0m2.234s 00:07:58.387 user 0m7.190s 00:07:58.387 
sys 0m0.612s 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:58.387 13:26:37 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:58.387 ************************************ 00:07:58.387 END TEST accel_cdev_decomp_mcore 00:07:58.387 ************************************ 00:07:58.387 13:26:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:58.387 13:26:37 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:58.387 13:26:37 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:58.387 13:26:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.387 13:26:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.387 ************************************ 00:07:58.387 START TEST accel_cdev_decomp_full_mcore 00:07:58.387 ************************************ 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:58.387 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:58.388 13:26:37 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:58.388 [2024-07-15 13:26:37.706685] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:07:58.388 [2024-07-15 13:26:37.706745] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045522 ] 00:07:58.646 [2024-07-15 13:26:37.839433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:58.646 [2024-07-15 13:26:37.944966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:58.646 [2024-07-15 13:26:37.945011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:58.646 [2024-07-15 13:26:37.945088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:58.646 [2024-07-15 13:26:37.945093] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.580 [2024-07-15 13:26:38.705665] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:59.580 [2024-07-15 13:26:38.708281] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22c0720 PMD being used: compress_qat 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 [2024-07-15 13:26:38.713054] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1ff819b8b0 PMD being used: compress_qat 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 [2024-07-15 13:26:38.713812] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1ff019b8b0 PMD being used: compress_qat 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 [2024-07-15 13:26:38.714966] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22c3a30 PMD being used: compress_qat 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 [2024-07-15 13:26:38.715180] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1fe819b8b0 PMD being used: compress_qat 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 
-- # val=decompress 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:59.580 13:26:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:00.516 13:26:39 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:00.516 00:08:00.516 real 0m2.240s 00:08:00.516 user 0m7.215s 00:08:00.516 sys 0m0.593s 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.516 13:26:39 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:00.516 ************************************ 00:08:00.516 END TEST accel_cdev_decomp_full_mcore 00:08:00.516 ************************************ 00:08:00.781 13:26:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.781 13:26:39 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:00.781 13:26:39 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:00.781 13:26:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.781 13:26:39 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.781 ************************************ 00:08:00.781 START TEST accel_cdev_decomp_mthread 00:08:00.781 ************************************ 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:00.781 13:26:39 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:00.781 [2024-07-15 13:26:40.025993] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:00.781 [2024-07-15 13:26:40.026063] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2045895 ] 00:08:00.781 [2024-07-15 13:26:40.158372] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.038 [2024-07-15 13:26:40.260803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.604 [2024-07-15 13:26:41.016520] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:01.604 [2024-07-15 13:26:41.019077] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x242b080 PMD being used: compress_qat 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.604 [2024-07-15 13:26:41.023971] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24302a0 PMD being used: compress_qat 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r 
var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 [2024-07-15 13:26:41.026513] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25530f0 PMD being used: compress_qat 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.604 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.604 
13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.862 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:01.863 13:26:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:02.798 00:08:02.798 real 0m2.213s 00:08:02.798 user 0m1.633s 00:08:02.798 sys 0m0.583s 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.798 13:26:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:02.798 ************************************ 00:08:02.798 END TEST accel_cdev_decomp_mthread 00:08:02.798 ************************************ 00:08:03.057 13:26:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.057 13:26:42 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:03.057 13:26:42 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:03.057 13:26:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.057 13:26:42 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.057 ************************************ 00:08:03.057 START TEST accel_cdev_decomp_full_mthread 00:08:03.057 ************************************ 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:03.057 13:26:42 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:08:03.057 [2024-07-15 13:26:42.301351] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:08:03.057 [2024-07-15 13:26:42.301410] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046259 ] 00:08:03.057 [2024-07-15 13:26:42.428638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.316 [2024-07-15 13:26:42.529386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.251 [2024-07-15 13:26:43.323070] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:04.251 [2024-07-15 13:26:43.325717] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcf2080 PMD being used: compress_qat 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.251 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.251 [2024-07-15 13:26:43.329948] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcf53b0 PMD being used: compress_qat 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.252 [2024-07-15 13:26:43.332834] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe19cc0 PMD being used: compress_qat 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:04.252 13:26:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:05.190 00:08:05.190 real 0m2.241s 00:08:05.190 user 0m1.669s 00:08:05.190 sys 0m0.574s 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.190 13:26:44 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:05.190 ************************************ 00:08:05.190 END TEST accel_cdev_decomp_full_mthread 00:08:05.190 ************************************ 00:08:05.190 13:26:44 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:08:05.190 13:26:44 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:05.190 13:26:44 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:05.190 13:26:44 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:05.190 13:26:44 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:05.190 13:26:44 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.190 13:26:44 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.190 13:26:44 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.190 13:26:44 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.190 13:26:44 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.190 13:26:44 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.190 13:26:44 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:05.190 13:26:44 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:05.190 13:26:44 accel -- accel/accel.sh@41 -- # jq -r . 00:08:05.191 ************************************ 00:08:05.191 START TEST accel_dif_functional_tests 00:08:05.191 ************************************ 00:08:05.191 13:26:44 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:05.451 [2024-07-15 13:26:44.659178] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:05.451 [2024-07-15 13:26:44.659248] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046494 ] 00:08:05.451 [2024-07-15 13:26:44.792058] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:05.710 [2024-07-15 13:26:44.901313] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:05.710 [2024-07-15 13:26:44.901405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:05.710 [2024-07-15 13:26:44.901409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.710 00:08:05.710 00:08:05.710 CUnit - A unit testing framework for C - Version 2.1-3 00:08:05.710 http://cunit.sourceforge.net/ 00:08:05.710 00:08:05.710 00:08:05.710 Suite: accel_dif 00:08:05.710 Test: verify: DIF generated, GUARD check ...passed 00:08:05.710 Test: verify: DIF generated, APPTAG check ...passed 00:08:05.710 Test: verify: DIF generated, REFTAG check ...passed 00:08:05.710 Test: verify: DIF not generated, GUARD check ...[2024-07-15 13:26:45.000837] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:05.710 passed 00:08:05.710 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 13:26:45.000909] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:05.710 passed 00:08:05.710 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 13:26:45.000947] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:05.710 passed 00:08:05.710 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:05.710 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 13:26:45.001017] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:05.710 passed 
00:08:05.710 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:05.710 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:05.710 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:05.710 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 13:26:45.001163] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:05.710 passed 00:08:05.710 Test: verify copy: DIF generated, GUARD check ...passed 00:08:05.710 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:05.710 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:05.710 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 13:26:45.001321] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:05.710 passed 00:08:05.710 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 13:26:45.001355] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:05.710 passed 00:08:05.710 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 13:26:45.001390] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:05.710 passed 00:08:05.710 Test: generate copy: DIF generated, GUARD check ...passed 00:08:05.710 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:05.710 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:05.710 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:05.710 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:05.710 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:05.710 Test: generate copy: iovecs-len validate ...[2024-07-15 13:26:45.001643] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:05.710 passed 00:08:05.710 Test: generate copy: buffer alignment validate ...passed 00:08:05.710 00:08:05.710 Run Summary: Type Total Ran Passed Failed Inactive 00:08:05.710 suites 1 1 n/a 0 0 00:08:05.710 tests 26 26 26 0 0 00:08:05.710 asserts 115 115 115 0 n/a 00:08:05.710 00:08:05.710 Elapsed time = 0.003 seconds 00:08:05.970 00:08:05.970 real 0m0.616s 00:08:05.970 user 0m0.806s 00:08:05.970 sys 0m0.226s 00:08:05.970 13:26:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.970 13:26:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:05.970 ************************************ 00:08:05.970 END TEST accel_dif_functional_tests 00:08:05.970 ************************************ 00:08:05.970 13:26:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:05.970 00:08:05.970 real 0m53.674s 00:08:05.970 user 1m1.783s 00:08:05.970 sys 0m11.978s 00:08:05.970 13:26:45 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.970 13:26:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.970 ************************************ 00:08:05.970 END TEST accel 00:08:05.970 ************************************ 00:08:05.970 13:26:45 -- common/autotest_common.sh@1142 -- # return 0 00:08:05.970 13:26:45 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:05.970 13:26:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:05.970 13:26:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.970 13:26:45 -- common/autotest_common.sh@10 -- # set +x 00:08:05.970 ************************************ 00:08:05.970 START TEST accel_rpc 00:08:05.970 ************************************ 00:08:05.970 13:26:45 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:06.229 * Looking for test storage... 
00:08:06.229 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:06.229 13:26:45 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:06.229 13:26:45 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2046695 00:08:06.229 13:26:45 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:06.229 13:26:45 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2046695 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2046695 ']' 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:06.229 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:06.229 13:26:45 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:06.229 [2024-07-15 13:26:45.493027] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:06.229 [2024-07-15 13:26:45.493082] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2046695 ] 00:08:06.229 [2024-07-15 13:26:45.605694] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.488 [2024-07-15 13:26:45.712825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.056 13:26:46 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:07.056 13:26:46 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:07.056 13:26:46 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:07.056 13:26:46 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:07.056 13:26:46 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:07.056 13:26:46 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:07.056 13:26:46 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:07.056 13:26:46 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:07.056 13:26:46 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.056 13:26:46 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:07.056 ************************************ 00:08:07.056 START TEST accel_assign_opcode 00:08:07.056 ************************************ 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:07.056 [2024-07-15 13:26:46.378976] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:07.056 [2024-07-15 13:26:46.386989] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:07.056 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:07.316 software 00:08:07.316 00:08:07.316 real 0m0.298s 00:08:07.316 user 0m0.043s 00:08:07.316 sys 0m0.009s 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:07.316 13:26:46 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:07.316 ************************************ 00:08:07.316 END TEST accel_assign_opcode 00:08:07.316 ************************************ 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:07.316 13:26:46 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2046695 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2046695 ']' 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2046695 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:07.316 13:26:46 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2046695 00:08:07.576 13:26:46 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:07.576 13:26:46 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:07.576 13:26:46 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2046695' 00:08:07.576 killing process with pid 2046695 00:08:07.576 13:26:46 accel_rpc -- common/autotest_common.sh@967 -- # kill 2046695 00:08:07.576 13:26:46 accel_rpc -- common/autotest_common.sh@972 -- # wait 2046695 00:08:07.836 00:08:07.836 real 0m1.814s 00:08:07.836 user 0m1.836s 00:08:07.836 sys 0m0.528s 00:08:07.836 13:26:47 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:07.836 13:26:47 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:07.836 ************************************ 00:08:07.836 END TEST accel_rpc 00:08:07.836 ************************************ 00:08:07.836 13:26:47 -- common/autotest_common.sh@1142 -- # return 0 00:08:07.836 13:26:47 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:07.836 13:26:47 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:07.836 13:26:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.836 13:26:47 -- common/autotest_common.sh@10 -- # set +x 00:08:07.836 ************************************ 00:08:07.836 START TEST app_cmdline 00:08:07.836 ************************************ 00:08:07.836 13:26:47 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:08.095 * Looking for test storage... 00:08:08.095 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:08.095 13:26:47 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:08.095 13:26:47 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2047010 00:08:08.095 13:26:47 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:08.095 13:26:47 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2047010 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2047010 ']' 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:08.095 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:08.095 13:26:47 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:08.095 [2024-07-15 13:26:47.394552] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:08.095 [2024-07-15 13:26:47.394613] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047010 ] 00:08:08.095 [2024-07-15 13:26:47.507615] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.354 [2024-07-15 13:26:47.613353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.921 13:26:48 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:08.921 13:26:48 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:08.921 13:26:48 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:09.180 { 00:08:09.180 "version": "SPDK v24.09-pre git sha1 e7cce062d", 00:08:09.180 "fields": { 00:08:09.180 "major": 24, 00:08:09.180 "minor": 9, 00:08:09.180 "patch": 0, 00:08:09.180 "suffix": "-pre", 00:08:09.180 "commit": "e7cce062d" 00:08:09.180 } 00:08:09.180 } 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:09.180 13:26:48 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:09.180 13:26:48 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:09.180 13:26:48 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:09.181 13:26:48 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:09.181 13:26:48 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:09.181 13:26:48 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:09.181 13:26:48 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:09.181 13:26:48 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:09.439 request: 00:08:09.439 { 00:08:09.439 "method": "env_dpdk_get_mem_stats", 00:08:09.439 "req_id": 1 00:08:09.439 } 00:08:09.439 Got JSON-RPC error response 00:08:09.439 response: 00:08:09.439 { 00:08:09.439 
"code": -32601, 00:08:09.439 "message": "Method not found" 00:08:09.439 } 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:09.439 13:26:48 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2047010 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2047010 ']' 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2047010 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2047010 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2047010' 00:08:09.439 killing process with pid 2047010 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@967 -- # kill 2047010 00:08:09.439 13:26:48 app_cmdline -- common/autotest_common.sh@972 -- # wait 2047010 00:08:10.005 00:08:10.005 real 0m1.901s 00:08:10.005 user 0m2.214s 00:08:10.005 sys 0m0.588s 00:08:10.005 13:26:49 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.005 13:26:49 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:10.005 ************************************ 00:08:10.005 END TEST app_cmdline 00:08:10.005 ************************************ 00:08:10.005 13:26:49 -- common/autotest_common.sh@1142 -- # return 0 00:08:10.005 13:26:49 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:10.005 13:26:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:10.005 13:26:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.005 13:26:49 -- common/autotest_common.sh@10 -- # set +x 00:08:10.005 ************************************ 00:08:10.005 START TEST version 00:08:10.005 ************************************ 00:08:10.005 13:26:49 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:10.005 * Looking for test storage... 00:08:10.005 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:10.005 13:26:49 version -- app/version.sh@17 -- # get_header_version major 00:08:10.005 13:26:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # cut -f2 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:10.005 13:26:49 version -- app/version.sh@17 -- # major=24 00:08:10.005 13:26:49 version -- app/version.sh@18 -- # get_header_version minor 00:08:10.005 13:26:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # cut -f2 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:10.005 13:26:49 version -- app/version.sh@18 -- # minor=9 00:08:10.005 13:26:49 version -- app/version.sh@19 -- # get_header_version patch 00:08:10.005 13:26:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # cut -f2 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:10.005 
13:26:49 version -- app/version.sh@19 -- # patch=0 00:08:10.005 13:26:49 version -- app/version.sh@20 -- # get_header_version suffix 00:08:10.005 13:26:49 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # cut -f2 00:08:10.005 13:26:49 version -- app/version.sh@14 -- # tr -d '"' 00:08:10.005 13:26:49 version -- app/version.sh@20 -- # suffix=-pre 00:08:10.005 13:26:49 version -- app/version.sh@22 -- # version=24.9 00:08:10.005 13:26:49 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:10.005 13:26:49 version -- app/version.sh@28 -- # version=24.9rc0 00:08:10.005 13:26:49 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:10.005 13:26:49 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:10.005 13:26:49 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:10.005 13:26:49 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:10.005 00:08:10.005 real 0m0.187s 00:08:10.005 user 0m0.098s 00:08:10.005 sys 0m0.134s 00:08:10.005 13:26:49 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.005 13:26:49 version -- common/autotest_common.sh@10 -- # set +x 00:08:10.005 ************************************ 00:08:10.005 END TEST version 00:08:10.005 ************************************ 00:08:10.263 13:26:49 -- common/autotest_common.sh@1142 -- # return 0 00:08:10.263 13:26:49 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:10.263 13:26:49 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:10.263 13:26:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:10.263 13:26:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.263 13:26:49 -- common/autotest_common.sh@10 -- # set +x 00:08:10.263 ************************************ 00:08:10.263 START TEST blockdev_general 00:08:10.263 ************************************ 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:10.263 * Looking for test storage... 00:08:10.263 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:10.263 13:26:49 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2047419 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2047419 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2047419 ']' 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:10.263 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:10.263 13:26:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:10.263 13:26:49 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:10.552 [2024-07-15 13:26:49.732465] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:08:10.552 [2024-07-15 13:26:49.732603] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047419 ] 00:08:10.552 [2024-07-15 13:26:49.925615] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.811 [2024-07-15 13:26:50.035504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.745 13:26:50 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:11.745 13:26:50 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:11.745 13:26:50 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:11.745 13:26:50 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:11.745 13:26:50 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:11.745 13:26:50 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:11.745 13:26:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:11.746 [2024-07-15 13:26:51.105182] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:11.746 [2024-07-15 13:26:51.105234] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:11.746 00:08:11.746 [2024-07-15 13:26:51.113165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:11.746 [2024-07-15 13:26:51.113190] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:11.746 00:08:11.746 Malloc0 00:08:11.746 Malloc1 00:08:11.746 Malloc2 00:08:12.004 Malloc3 00:08:12.004 Malloc4 00:08:12.004 Malloc5 00:08:12.004 Malloc6 00:08:12.004 Malloc7 00:08:12.004 Malloc8 00:08:12.004 Malloc9 00:08:12.004 [2024-07-15 13:26:51.262036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:12.004 [2024-07-15 13:26:51.262083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:12.004 [2024-07-15 13:26:51.262105] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1794350 00:08:12.004 [2024-07-15 13:26:51.262117] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:12.004 [2024-07-15 13:26:51.263458] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:12.004 [2024-07-15 13:26:51.263485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:12.004 TestPT 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:12.004 5000+0 records in 00:08:12.004 5000+0 records out 00:08:12.004 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0253027 s, 405 MB/s 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.004 AIO0 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 
00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.004 13:26:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.004 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.261 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.261 13:26:51 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:12.261 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.262 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.262 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:12.262 13:26:51 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:12.262 13:26:51 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:12.262 13:26:51 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:12.262 13:26:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:12.262 13:26:51 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:12.262 13:26:51 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 
]] 00:08:12.262 13:26:51 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:12.262 13:26:51 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:12.263 13:26:51 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "26ffaa6f-666b-4317-bfd1-f2b64e05cac4"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "26ffaa6f-666b-4317-bfd1-f2b64e05cac4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "a7dba23a-8c7e-583d-abf3-26648a811565"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a7dba23a-8c7e-583d-abf3-26648a811565",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "3dd72c1d-4b48-5952-8451-a65e8c07a6e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3dd72c1d-4b48-5952-8451-a65e8c07a6e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3bc26509-470c-5ad5-bab0-c1814b88a99f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3bc26509-470c-5ad5-bab0-c1814b88a99f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' 
' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "72d35a18-5d13-56a4-8fbf-60b23cf0375f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "72d35a18-5d13-56a4-8fbf-60b23cf0375f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "88b8e797-97ee-5460-a6ca-b28e077538c0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "88b8e797-97ee-5460-a6ca-b28e077538c0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "e799c281-7140-5f8d-bf44-c1d2f67d10df"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e799c281-7140-5f8d-bf44-c1d2f67d10df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f21b2f8f-ac17-5e44-8616-2818c301c778"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f21b2f8f-ac17-5e44-8616-2818c301c778",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "94d0730b-58a4-53f2-a681-9d59ded144b5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "94d0730b-58a4-53f2-a681-9d59ded144b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6d8b9fe5-4c76-545f-b848-0df8de5a731b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6d8b9fe5-4c76-545f-b848-0df8de5a731b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "760c27af-2c74-5bf1-8cbd-ae975869e026"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "760c27af-2c74-5bf1-8cbd-ae975869e026",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "bde1807e-00b6-54b5-87f7-ef0745695631"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bde1807e-00b6-54b5-87f7-ef0745695631",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' 
{' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "749127f1-0b3d-4fe0-8405-e8efcae6fe48"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "fc2be8be-4a76-442e-acd5-1477d06459d4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f7181078-131a-4d67-92a9-d00b5ace130a",' ' "is_configured": 
true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "4838f73b-98ad-469c-affc-d1f7e6ace3e0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "b9369573-bfec-47e6-8a0c-e21253fdac95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "03d9aeb4-7541-4713-b407-1f62933a0bbb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "2f7aedea-939b-4742-8dc2-160109056578"' ' ],' ' "product_name": 
"Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d86ee5a6-5e98-4a5f-91d0-fafcd7039bce",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e8d6cf89-ed34-46bd-8213-5529066353ec",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a681b5b2-d1bd-4ca7-a948-7d5037183869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a681b5b2-d1bd-4ca7-a948-7d5037183869",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:12.263 13:26:51 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:12.263 13:26:51 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:12.263 13:26:51 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:12.263 13:26:51 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2047419 00:08:12.263 13:26:51 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2047419 ']' 00:08:12.263 13:26:51 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2047419 00:08:12.263 13:26:51 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2047419 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2047419' 00:08:12.520 
killing process with pid 2047419 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@967 -- # kill 2047419 00:08:12.520 13:26:51 blockdev_general -- common/autotest_common.sh@972 -- # wait 2047419 00:08:13.088 13:26:52 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:13.088 13:26:52 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:13.088 13:26:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:13.088 13:26:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.088 13:26:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:13.088 ************************************ 00:08:13.088 START TEST bdev_hello_world 00:08:13.088 ************************************ 00:08:13.088 13:26:52 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:13.088 [2024-07-15 13:26:52.339729] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:13.088 [2024-07-15 13:26:52.339788] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047796 ] 00:08:13.088 [2024-07-15 13:26:52.467658] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.346 [2024-07-15 13:26:52.569348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.346 [2024-07-15 13:26:52.739666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:13.346 [2024-07-15 13:26:52.739733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:13.347 [2024-07-15 13:26:52.739748] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:13.347 [2024-07-15 13:26:52.747673] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:13.347 [2024-07-15 13:26:52.747704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:13.347 [2024-07-15 13:26:52.755684] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:13.347 [2024-07-15 13:26:52.755710] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:13.611 [2024-07-15 13:26:52.829978] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:13.611 [2024-07-15 13:26:52.830032] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:13.611 [2024-07-15 13:26:52.830050] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fb3c0 00:08:13.611 [2024-07-15 13:26:52.830063] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:13.611 [2024-07-15 13:26:52.831485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:08:13.611 [2024-07-15 13:26:52.831514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:13.611 [2024-07-15 13:26:52.967369] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:13.611 [2024-07-15 13:26:52.967435] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:13.611 [2024-07-15 13:26:52.967490] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:13.611 [2024-07-15 13:26:52.967568] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:13.611 [2024-07-15 13:26:52.967646] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:13.611 [2024-07-15 13:26:52.967676] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:13.611 [2024-07-15 13:26:52.967741] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:08:13.611 00:08:13.611 [2024-07-15 13:26:52.967782] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:13.869 00:08:13.869 real 0m1.007s 00:08:13.869 user 0m0.661s 00:08:13.869 sys 0m0.303s 00:08:13.869 13:26:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.869 13:26:53 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:13.869 ************************************ 00:08:13.869 END TEST bdev_hello_world 00:08:13.869 ************************************ 00:08:14.127 13:26:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:14.127 13:26:53 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:14.127 13:26:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:14.127 13:26:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.127 13:26:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:14.127 ************************************ 00:08:14.127 START 
TEST bdev_bounds 00:08:14.127 ************************************ 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2047990 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2047990' 00:08:14.127 Process bdevio pid: 2047990 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2047990 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2047990 ']' 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:14.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:14.127 13:26:53 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:14.127 [2024-07-15 13:26:53.411696] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:08:14.127 [2024-07-15 13:26:53.411765] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2047990 ] 00:08:14.127 [2024-07-15 13:26:53.542794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:14.385 [2024-07-15 13:26:53.652564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.385 [2024-07-15 13:26:53.652650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:14.385 [2024-07-15 13:26:53.652654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.643 [2024-07-15 13:26:53.812701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:14.643 [2024-07-15 13:26:53.812755] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:14.643 [2024-07-15 13:26:53.812771] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:14.643 [2024-07-15 13:26:53.820713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:14.643 [2024-07-15 13:26:53.820741] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:14.643 [2024-07-15 13:26:53.828723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:14.643 [2024-07-15 13:26:53.828748] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:14.643 [2024-07-15 13:26:53.906116] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:14.643 [2024-07-15 13:26:53.906167] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:14.643 [2024-07-15 13:26:53.906186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d00c0 
00:08:14.643 [2024-07-15 13:26:53.906198] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:14.643 [2024-07-15 13:26:53.907648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:14.643 [2024-07-15 13:26:53.907683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:15.209 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:15.209 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:15.210 13:26:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:15.210 I/O targets: 00:08:15.210 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:15.210 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:15.210 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:15.210 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:15.210 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:15.210 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:15.210 concat0: 131072 blocks of 512 bytes (64 MiB) 00:08:15.210 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:15.210 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:15.210 00:08:15.210 00:08:15.210 CUnit - A unit testing framework for C - Version 2.1-3 00:08:15.210 http://cunit.sourceforge.net/ 00:08:15.210 00:08:15.210 00:08:15.210 Suite: bdevio tests on: AIO0 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 
Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: raid1 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: concat0 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: raid0 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: TestPT 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 
00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: Malloc2p7 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: Malloc2p6 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.210 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.210 Test: blockdev writev readv 8 blocks ...passed 00:08:15.210 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.210 Test: blockdev writev readv block ...passed 00:08:15.210 Test: blockdev writev readv size > 128k ...passed 00:08:15.210 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.210 Test: blockdev comparev and writev ...passed 00:08:15.210 Test: blockdev nvme passthru rw ...passed 00:08:15.210 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.210 Test: blockdev nvme admin passthru ...passed 00:08:15.210 Test: blockdev copy ...passed 00:08:15.210 Suite: bdevio tests on: Malloc2p5 00:08:15.210 Test: blockdev write read block ...passed 00:08:15.210 Test: blockdev write zeroes read block ...passed 00:08:15.210 Test: blockdev write zeroes read no split ...passed 00:08:15.210 Test: blockdev write zeroes read split ...passed 00:08:15.210 Test: blockdev write zeroes read split partial ...passed 00:08:15.210 Test: blockdev reset ...passed 00:08:15.210 Test: blockdev write read 8 blocks ...passed 00:08:15.210 Test: blockdev write read size > 128k ...passed 00:08:15.210 Test: blockdev write read invalid size ...passed 00:08:15.210 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.210 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.210 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc2p4 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block 
...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc2p3 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc2p2 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 
00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc2p1 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc2p0 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write 
zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc1p1 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc1p0 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.211 Test: blockdev write zeroes read split partial ...passed 00:08:15.211 Test: blockdev reset ...passed 00:08:15.211 Test: blockdev write read 8 blocks ...passed 00:08:15.211 Test: blockdev write read size > 128k ...passed 00:08:15.211 Test: blockdev write read invalid size ...passed 00:08:15.211 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.211 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.211 Test: blockdev write read max offset ...passed 00:08:15.211 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.211 Test: blockdev writev readv 8 blocks ...passed 00:08:15.211 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.211 Test: blockdev writev readv block ...passed 00:08:15.211 Test: blockdev writev readv size > 
128k ...passed 00:08:15.211 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.211 Test: blockdev comparev and writev ...passed 00:08:15.211 Test: blockdev nvme passthru rw ...passed 00:08:15.211 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.211 Test: blockdev nvme admin passthru ...passed 00:08:15.211 Test: blockdev copy ...passed 00:08:15.211 Suite: bdevio tests on: Malloc0 00:08:15.211 Test: blockdev write read block ...passed 00:08:15.211 Test: blockdev write zeroes read block ...passed 00:08:15.211 Test: blockdev write zeroes read no split ...passed 00:08:15.211 Test: blockdev write zeroes read split ...passed 00:08:15.509 Test: blockdev write zeroes read split partial ...passed 00:08:15.509 Test: blockdev reset ...passed 00:08:15.509 Test: blockdev write read 8 blocks ...passed 00:08:15.509 Test: blockdev write read size > 128k ...passed 00:08:15.509 Test: blockdev write read invalid size ...passed 00:08:15.509 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:15.509 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:15.509 Test: blockdev write read max offset ...passed 00:08:15.509 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:15.509 Test: blockdev writev readv 8 blocks ...passed 00:08:15.509 Test: blockdev writev readv 30 x 1block ...passed 00:08:15.509 Test: blockdev writev readv block ...passed 00:08:15.509 Test: blockdev writev readv size > 128k ...passed 00:08:15.509 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:15.509 Test: blockdev comparev and writev ...passed 00:08:15.509 Test: blockdev nvme passthru rw ...passed 00:08:15.509 Test: blockdev nvme passthru vendor specific ...passed 00:08:15.509 Test: blockdev nvme admin passthru ...passed 00:08:15.509 Test: blockdev copy ...passed 00:08:15.509 00:08:15.510 Run Summary: Type Total Ran Passed Failed Inactive 00:08:15.510 suites 16 16 n/a 0 0 00:08:15.510 
tests 368 368 368 0 0 00:08:15.510 asserts 2224 2224 2224 0 n/a 00:08:15.510 00:08:15.510 Elapsed time = 0.497 seconds 00:08:15.510 0 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2047990 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2047990 ']' 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2047990 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2047990 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2047990' 00:08:15.510 killing process with pid 2047990 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2047990 00:08:15.510 13:26:54 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2047990 00:08:15.768 13:26:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:15.768 00:08:15.768 real 0m1.692s 00:08:15.768 user 0m4.146s 00:08:15.768 sys 0m0.507s 00:08:15.768 13:26:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.768 13:26:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:15.768 ************************************ 00:08:15.768 END TEST bdev_bounds 00:08:15.768 ************************************ 00:08:15.768 13:26:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:08:15.768 13:26:55 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:15.768 13:26:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:15.768 13:26:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.768 13:26:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:15.768 ************************************ 00:08:15.768 START TEST bdev_nbd 00:08:15.768 ************************************ 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2048197 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2048197 /var/tmp/spdk-nbd.sock 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2048197 ']' 00:08:15.768 13:26:55 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:15.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:15.768 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:15.769 13:26:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:16.026 [2024-07-15 13:26:55.204835] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:08:16.026 [2024-07-15 13:26:55.204906] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:16.026 [2024-07-15 13:26:55.339071] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.026 [2024-07-15 13:26:55.443722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.296 [2024-07-15 13:26:55.597599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:16.296 [2024-07-15 13:26:55.597659] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:16.296 [2024-07-15 13:26:55.597675] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:16.296 [2024-07-15 13:26:55.605608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:16.296 [2024-07-15 13:26:55.605636] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:16.296 [2024-07-15 13:26:55.613618] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:16.296 [2024-07-15 13:26:55.613643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:16.296 [2024-07-15 13:26:55.685959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:16.296 [2024-07-15 13:26:55.686017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:16.296 [2024-07-15 13:26:55.686034] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f36a40 00:08:16.296 [2024-07-15 13:26:55.686047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:16.296 [2024-07-15 13:26:55.687507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:16.296 [2024-07-15 13:26:55.687536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:16.867 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:17.125 1+0 records in 00:08:17.125 1+0 records out 00:08:17.125 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252738 s, 16.2 MB/s 00:08:17.125 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:17.126 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:17.386 1+0 records in 00:08:17.386 1+0 records out 00:08:17.386 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231169 s, 17.7 MB/s 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:17.386 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:17.645 1+0 records in 00:08:17.645 1+0 records out 00:08:17.645 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330437 s, 12.4 MB/s 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:17.645 13:26:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:17.903 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:17.904 1+0 records in 00:08:17.904 1+0 records out 00:08:17.904 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341506 s, 12.0 MB/s 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:17.904 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:18.162 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:18.163 1+0 records in 00:08:18.163 1+0 records out 00:08:18.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267697 s, 15.3 MB/s 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:18.163 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:18.421 1+0 records in 00:08:18.421 1+0 records out 00:08:18.421 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329507 s, 12.4 MB/s 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:18.421 13:26:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:18.680 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:18.681 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:18.681 1+0 records in 00:08:18.681 1+0 records out 00:08:18.681 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000476739 s, 8.6 MB/s 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:18.939 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:18.939 13:26:58 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:19.198 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:19.199 1+0 records in 00:08:19.199 1+0 records out 00:08:19.199 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000412852 s, 9.9 MB/s 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:19.199 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:19.457 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:19.458 1+0 records in 00:08:19.458 1+0 records out 
00:08:19.458 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000394366 s, 10.4 MB/s
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:19.458 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:19.717 1+0 records in
00:08:19.717 1+0 records out
00:08:19.717 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536349 s, 7.6 MB/s
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:19.717 13:26:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:19.976 1+0 records in
00:08:19.976 1+0 records out
00:08:19.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000614433 s, 6.7 MB/s
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:19.976 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:20.235 1+0 records in
00:08:20.235 1+0 records out
00:08:20.235 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000501206 s, 8.2 MB/s
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:20.235 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:20.495 1+0 records in
00:08:20.495 1+0 records out
00:08:20.495 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000580344 s, 7.1 MB/s
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:20.495 13:26:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0
00:08:20.754 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13
00:08:20.754 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13
00:08:20.754 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:20.755 1+0 records in
00:08:20.755 1+0 records out
00:08:20.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000495575 s, 8.3 MB/s
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:20.755 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:21.015 1+0 records in
00:08:21.015 1+0 records out
00:08:21.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000796045 s, 5.1 MB/s
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:21.015 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:21.275 1+0 records in
00:08:21.275 1+0 records out
00:08:21.275 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000623797 s, 6.6 MB/s
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 ))
00:08:21.275 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:08:21.535 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd0",
00:08:21.535 "bdev_name": "Malloc0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd1",
00:08:21.535 "bdev_name": "Malloc1p0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd2",
00:08:21.535 "bdev_name": "Malloc1p1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd3",
00:08:21.535 "bdev_name": "Malloc2p0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd4",
00:08:21.535 "bdev_name": "Malloc2p1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd5",
00:08:21.535 "bdev_name": "Malloc2p2"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd6",
00:08:21.535 "bdev_name": "Malloc2p3"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd7",
00:08:21.535 "bdev_name": "Malloc2p4"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd8",
00:08:21.535 "bdev_name": "Malloc2p5"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd9",
00:08:21.535 "bdev_name": "Malloc2p6"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd10",
00:08:21.535 "bdev_name": "Malloc2p7"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd11",
00:08:21.535 "bdev_name": "TestPT"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd12",
00:08:21.535 "bdev_name": "raid0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd13",
00:08:21.535 "bdev_name": "concat0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd14",
00:08:21.535 "bdev_name": "raid1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd15",
00:08:21.535 "bdev_name": "AIO0"
00:08:21.535 }
00:08:21.535 ]'
00:08:21.535 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:08:21.535 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd0",
00:08:21.535 "bdev_name": "Malloc0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd1",
00:08:21.535 "bdev_name": "Malloc1p0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd2",
00:08:21.535 "bdev_name": "Malloc1p1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd3",
00:08:21.535 "bdev_name": "Malloc2p0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd4",
00:08:21.535 "bdev_name": "Malloc2p1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd5",
00:08:21.535 "bdev_name": "Malloc2p2"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd6",
00:08:21.535 "bdev_name": "Malloc2p3"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd7",
00:08:21.535 "bdev_name": "Malloc2p4"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd8",
00:08:21.535 "bdev_name": "Malloc2p5"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd9",
00:08:21.535 "bdev_name": "Malloc2p6"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd10",
00:08:21.535 "bdev_name": "Malloc2p7"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd11",
00:08:21.535 "bdev_name": "TestPT"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd12",
00:08:21.535 "bdev_name": "raid0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd13",
00:08:21.535 "bdev_name": "concat0"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd14",
00:08:21.535 "bdev_name": "raid1"
00:08:21.535 },
00:08:21.535 {
00:08:21.535 "nbd_device": "/dev/nbd15",
00:08:21.535 "bdev_name": "AIO0"
00:08:21.535 }
00:08:21.535 ]'
00:08:21.535 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:08:21.535 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15'
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15')
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:21.536 13:27:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:21.795 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:22.054 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:22.313 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:22.572 13:27:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:22.832 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:23.090 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:23.348 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions
00:08:23.611 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:23.611 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:23.611 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:23.611 13:27:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:23.932 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:24.226 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:24.484 13:27:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:24.743 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:25.001 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:25.270 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:25.532 13:27:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:25.790 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:08:26.049 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:08:26.049 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:08:26.049 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9'
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:26.308 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:26.566 /dev/nbd0 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:26.566 1+0 records in 00:08:26.566 1+0 records out 00:08:26.566 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275012 s, 14.9 MB/s 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:26.566 /dev/nbd1 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:26.566 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:26.824 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:26.824 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:26.824 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:26.824 1+0 records in 00:08:26.824 1+0 records out 00:08:26.824 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320686 s, 12.8 MB/s 00:08:26.824 13:27:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:26.824 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:27.083 /dev/nbd10 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.083 13:27:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.083 1+0 records in 00:08:27.083 1+0 records out 00:08:27.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309206 s, 13.2 MB/s 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:27.083 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:27.083 /dev/nbd11 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.342 1+0 records in 00:08:27.342 1+0 records out 00:08:27.342 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000365495 s, 11.2 MB/s 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:27.342 /dev/nbd12 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.342 1+0 records in 00:08:27.342 1+0 records out 00:08:27.342 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000402796 s, 10.2 MB/s 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.342 13:27:06 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:27.342 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:27.601 /dev/nbd13 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.601 1+0 records in 00:08:27.601 1+0 records out 00:08:27.601 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000380676 s, 10.8 MB/s 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:27.601 13:27:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:27.860 /dev/nbd14 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.860 1+0 records in 00:08:27.860 1+0 records out 00:08:27.860 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418184 s, 
9.8 MB/s 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:27.860 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:28.119 /dev/nbd15 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.119 1+0 records in 00:08:28.119 1+0 records out 00:08:28.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000538948 s, 7.6 MB/s 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.119 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:28.119 /dev/nbd2 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.379 1+0 records in 00:08:28.379 1+0 records out 00:08:28.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000524794 s, 7.8 MB/s 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.379 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:28.638 /dev/nbd3 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:28.638 13:27:07 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.638 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.639 1+0 records in 00:08:28.639 1+0 records out 00:08:28.639 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000521597 s, 7.9 MB/s 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.639 13:27:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:28.639 /dev/nbd4 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.639 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.639 1+0 records in 00:08:28.639 1+0 records out 00:08:28.639 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000475605 s, 8.6 MB/s 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:28.897 /dev/nbd5 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.897 1+0 records in 00:08:28.897 1+0 records out 00:08:28.897 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000582856 s, 7.0 MB/s 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:28.897 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:29.156 /dev/nbd6 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.156 1+0 records in 00:08:29.156 1+0 records out 00:08:29.156 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000740183 s, 5.5 MB/s 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.156 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:29.528 /dev/nbd7 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:29.528 13:27:08 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.528 1+0 records in 00:08:29.528 1+0 records out 00:08:29.528 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000681857 s, 6.0 MB/s 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.528 13:27:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:29.787 /dev/nbd8 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.787 1+0 records in 00:08:29.787 1+0 records out 00:08:29.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00111405 s, 3.7 MB/s 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:29.787 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:30.045 /dev/nbd9 00:08:30.045 13:27:09 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.045 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.046 1+0 records in 00:08:30.046 1+0 records out 00:08:30.046 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000722896 s, 5.7 MB/s 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:30.046 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd0", 00:08:30.304 "bdev_name": "Malloc0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd1", 00:08:30.304 "bdev_name": "Malloc1p0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd10", 00:08:30.304 "bdev_name": "Malloc1p1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd11", 00:08:30.304 "bdev_name": "Malloc2p0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd12", 00:08:30.304 "bdev_name": "Malloc2p1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd13", 00:08:30.304 "bdev_name": "Malloc2p2" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd14", 00:08:30.304 "bdev_name": "Malloc2p3" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd15", 00:08:30.304 "bdev_name": "Malloc2p4" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd2", 00:08:30.304 "bdev_name": "Malloc2p5" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd3", 00:08:30.304 "bdev_name": "Malloc2p6" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd4", 00:08:30.304 "bdev_name": "Malloc2p7" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd5", 00:08:30.304 "bdev_name": "TestPT" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd6", 00:08:30.304 
"bdev_name": "raid0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd7", 00:08:30.304 "bdev_name": "concat0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd8", 00:08:30.304 "bdev_name": "raid1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd9", 00:08:30.304 "bdev_name": "AIO0" 00:08:30.304 } 00:08:30.304 ]' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd0", 00:08:30.304 "bdev_name": "Malloc0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd1", 00:08:30.304 "bdev_name": "Malloc1p0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd10", 00:08:30.304 "bdev_name": "Malloc1p1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd11", 00:08:30.304 "bdev_name": "Malloc2p0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd12", 00:08:30.304 "bdev_name": "Malloc2p1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd13", 00:08:30.304 "bdev_name": "Malloc2p2" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd14", 00:08:30.304 "bdev_name": "Malloc2p3" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd15", 00:08:30.304 "bdev_name": "Malloc2p4" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd2", 00:08:30.304 "bdev_name": "Malloc2p5" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd3", 00:08:30.304 "bdev_name": "Malloc2p6" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd4", 00:08:30.304 "bdev_name": "Malloc2p7" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd5", 00:08:30.304 "bdev_name": "TestPT" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd6", 00:08:30.304 "bdev_name": "raid0" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd7", 00:08:30.304 "bdev_name": "concat0" 00:08:30.304 }, 00:08:30.304 { 
00:08:30.304 "nbd_device": "/dev/nbd8", 00:08:30.304 "bdev_name": "raid1" 00:08:30.304 }, 00:08:30.304 { 00:08:30.304 "nbd_device": "/dev/nbd9", 00:08:30.304 "bdev_name": "AIO0" 00:08:30.304 } 00:08:30.304 ]' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:30.304 /dev/nbd1 00:08:30.304 /dev/nbd10 00:08:30.304 /dev/nbd11 00:08:30.304 /dev/nbd12 00:08:30.304 /dev/nbd13 00:08:30.304 /dev/nbd14 00:08:30.304 /dev/nbd15 00:08:30.304 /dev/nbd2 00:08:30.304 /dev/nbd3 00:08:30.304 /dev/nbd4 00:08:30.304 /dev/nbd5 00:08:30.304 /dev/nbd6 00:08:30.304 /dev/nbd7 00:08:30.304 /dev/nbd8 00:08:30.304 /dev/nbd9' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:30.304 /dev/nbd1 00:08:30.304 /dev/nbd10 00:08:30.304 /dev/nbd11 00:08:30.304 /dev/nbd12 00:08:30.304 /dev/nbd13 00:08:30.304 /dev/nbd14 00:08:30.304 /dev/nbd15 00:08:30.304 /dev/nbd2 00:08:30.304 /dev/nbd3 00:08:30.304 /dev/nbd4 00:08:30.304 /dev/nbd5 00:08:30.304 /dev/nbd6 00:08:30.304 /dev/nbd7 00:08:30.304 /dev/nbd8 00:08:30.304 /dev/nbd9' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:30.304 256+0 records in 00:08:30.304 256+0 records out 00:08:30.304 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105406 s, 99.5 MB/s 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:30.304 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:30.563 256+0 records in 00:08:30.563 256+0 records out 00:08:30.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181986 s, 5.8 MB/s 00:08:30.563 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:30.563 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:30.563 256+0 records in 00:08:30.563 256+0 records out 00:08:30.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183717 s, 5.7 MB/s 00:08:30.563 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
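The write pass above fills a 1 MiB scratch file (`nbdrandtest`) from `/dev/urandom` and then copies it to each nbd device in turn. A self-contained sketch with temp files standing in for the scratch file and the device (`oflag=direct` dropped for plain files):

```shell
#!/usr/bin/env bash
# Sketch of the nbd_dd_data_verify write pass: 256 x 4096-byte blocks of
# random data (1 MiB total), written to a target that stands in for /dev/nbdX.
pattern=$(mktemp); target=$(mktemp)
dd if=/dev/urandom of="$pattern" bs=4096 count=256 2>/dev/null  # build the pattern
dd if="$pattern" of="$target"   bs=4096 count=256 2>/dev/null   # write it out
written=$(stat -c %s "$target")
echo "wrote $written bytes"
rm -f "$pattern" "$target"
```

With `bs=4096 count=256` the copy is 1,048,576 bytes, matching the `1048576 bytes (1.0 MB, 1.0 MiB) copied` lines in the trace.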
00:08:30.563 13:27:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:30.821 256+0 records in 00:08:30.821 256+0 records out 00:08:30.821 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184845 s, 5.7 MB/s 00:08:30.821 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:30.822 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:31.080 256+0 records in 00:08:31.080 256+0 records out 00:08:31.080 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183755 s, 5.7 MB/s 00:08:31.080 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.080 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:31.339 256+0 records in 00:08:31.339 256+0 records out 00:08:31.339 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183465 s, 5.7 MB/s 00:08:31.339 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.339 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:31.339 256+0 records in 00:08:31.339 256+0 records out 00:08:31.339 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.133082 s, 7.9 MB/s 00:08:31.339 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.339 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:31.597 256+0 records in 00:08:31.597 256+0 
records out 00:08:31.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183593 s, 5.7 MB/s 00:08:31.597 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.597 13:27:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:31.854 256+0 records in 00:08:31.854 256+0 records out 00:08:31.854 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183741 s, 5.7 MB/s 00:08:31.854 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.854 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:31.854 256+0 records in 00:08:31.854 256+0 records out 00:08:31.854 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183692 s, 5.7 MB/s 00:08:31.854 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:31.854 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:32.112 256+0 records in 00:08:32.112 256+0 records out 00:08:32.112 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184225 s, 5.7 MB/s 00:08:32.112 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.112 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:32.376 256+0 records in 00:08:32.376 256+0 records out 00:08:32.376 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.134684 s, 7.8 MB/s 00:08:32.376 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.376 13:27:11 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:32.376 256+0 records in 00:08:32.376 256+0 records out 00:08:32.376 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184153 s, 5.7 MB/s 00:08:32.376 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.376 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:32.632 256+0 records in 00:08:32.632 256+0 records out 00:08:32.632 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184373 s, 5.7 MB/s 00:08:32.632 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.632 13:27:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:32.889 256+0 records in 00:08:32.889 256+0 records out 00:08:32.889 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184919 s, 5.7 MB/s 00:08:32.889 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:32.889 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:33.146 256+0 records in 00:08:33.146 256+0 records out 00:08:33.146 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.187945 s, 5.6 MB/s 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:33.146 256+0 records in 00:08:33.146 256+0 records out 00:08:33.146 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.182351 s, 5.8 MB/s 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:33.146 
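The verify pass reads each device back and compares the first mebibyte against the random pattern with `cmp -b -n 1M`. A sketch of one such comparison, again with temp files standing in for the pattern file and `/dev/nbdX`:

```shell
#!/usr/bin/env bash
# Sketch of the verify step: cmp -b -n 1M exits nonzero (and prints the
# first differing byte, thanks to -b) if the round trip corrupted data.
pattern=$(mktemp); device=$(mktemp)
dd if=/dev/urandom of="$pattern" bs=4096 count=256 2>/dev/null
cp "$pattern" "$device"                      # simulate a clean round trip
if cmp -b -n 1M "$pattern" "$device"; then
    verdict=match
else
    verdict=mismatch
fi
echo "$verdict"
rm -f "$pattern" "$device"
```

`-n 1M` limits the comparison to the bytes actually written, so trailing device capacity beyond the pattern is ignored.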
13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.146 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.403 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.661 13:27:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.918 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.174 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:34.431 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.432 13:27:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.689 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.947 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.204 13:27:14 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.204 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.462 13:27:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.720 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.978 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:08:36.236 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:36.237 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.237 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.237 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.237 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.494 13:27:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:36.752 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:37.010 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:37.270 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:37.528 13:27:16 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:37.528 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:37.788 13:27:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:38.047 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:38.047 
13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:38.306 malloc_lvol_verify 00:08:38.306 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:38.566 27dfffc4-8fde-4e56-996e-e418b0aa6aa8 00:08:38.566 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:38.566 48ebcf67-0f77-4ba8-aa4c-7301c2940652 00:08:38.825 13:27:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:38.825 /dev/nbd0 00:08:39.084 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:39.084 mke2fs 1.46.5 (30-Dec-2021) 00:08:39.084 Discarding device blocks: 0/4096 done 00:08:39.084 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:39.084 00:08:39.084 Allocating group tables: 0/1 done 00:08:39.084 Writing inode tables: 0/1 done 00:08:39.084 Creating journal (1024 blocks): done 00:08:39.084 Writing superblocks and filesystem accounting information: 0/1 done 00:08:39.084 00:08:39.084 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:39.085 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:39.344 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2048197 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2048197 ']' 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2048197 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2048197 00:08:39.345 13:27:18 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2048197' 00:08:39.345 killing process with pid 2048197 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2048197 00:08:39.345 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2048197 00:08:39.604 13:27:18 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:39.604 00:08:39.604 real 0m23.780s 00:08:39.604 user 0m29.063s 00:08:39.604 sys 0m14.024s 00:08:39.604 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.604 13:27:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:39.604 ************************************ 00:08:39.604 END TEST bdev_nbd 00:08:39.604 ************************************ 00:08:39.604 13:27:18 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:39.604 13:27:18 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:39.604 13:27:18 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:39.604 13:27:18 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:39.604 13:27:18 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:39.604 13:27:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:39.604 13:27:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.604 13:27:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:39.604 ************************************ 00:08:39.604 START TEST bdev_fio 00:08:39.604 ************************************ 00:08:39.604 13:27:18 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:08:39.604 13:27:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:39.604 13:27:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:39.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:39.604 13:27:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:39.604 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:39.605 13:27:19 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:08:39.605 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:39.864 13:27:19 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.864 13:27:19 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:39.864 ************************************ 00:08:39.864 START TEST bdev_fio_rw_verify 00:08:39.864 ************************************ 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:39.864 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:39.865 13:27:19 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:39.865 13:27:19 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:40.124 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:40.124 fio-3.35 00:08:40.124 Starting 16 threads 00:08:52.323 00:08:52.323 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2052712: Mon Jul 15 13:27:30 2024 00:08:52.323 read: IOPS=88.1k, BW=344MiB/s (361MB/s)(3442MiB/10001msec) 00:08:52.323 slat (usec): min=2, max=357, avg=36.41, stdev=14.55 00:08:52.323 clat (usec): min=10, max=1507, avg=297.61, stdev=135.28 00:08:52.323 lat (usec): min=25, max=1577, avg=334.02, stdev=143.53 00:08:52.323 clat percentiles (usec): 00:08:52.323 | 50.000th=[ 289], 99.000th=[ 635], 99.900th=[ 807], 99.990th=[ 1303], 00:08:52.323 | 99.999th=[ 1434] 00:08:52.323 write: IOPS=139k, BW=542MiB/s (568MB/s)(5340MiB/9860msec); 0 zone resets 00:08:52.323 slat (usec): min=8, max=3948, avg=49.75, stdev=18.63 00:08:52.323 clat (usec): min=12, max=4352, avg=351.62, stdev=173.05 00:08:52.323 lat (usec): min=41, max=4396, avg=401.37, stdev=184.30 00:08:52.323 clat percentiles (usec): 
00:08:52.323 | 50.000th=[ 330], 99.000th=[ 988], 99.900th=[ 1418], 99.990th=[ 1516], 00:08:52.323 | 99.999th=[ 1778] 00:08:52.323 bw ( KiB/s): min=467624, max=728694, per=98.90%, avg=548485.37, stdev=4287.42, samples=304 00:08:52.323 iops : min=116906, max=182171, avg=137121.11, stdev=1071.85, samples=304 00:08:52.323 lat (usec) : 20=0.01%, 50=0.37%, 100=3.80%, 250=30.29%, 500=52.36% 00:08:52.323 lat (usec) : 750=12.09%, 1000=0.50% 00:08:52.323 lat (msec) : 2=0.59%, 10=0.01% 00:08:52.323 cpu : usr=99.22%, sys=0.38%, ctx=610, majf=0, minf=1999 00:08:52.323 IO depths : 1=12.5%, 2=24.9%, 4=50.1%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:52.323 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:52.323 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:52.323 issued rwts: total=881196,1367078,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:52.323 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:52.323 00:08:52.323 Run status group 0 (all jobs): 00:08:52.323 READ: bw=344MiB/s (361MB/s), 344MiB/s-344MiB/s (361MB/s-361MB/s), io=3442MiB (3609MB), run=10001-10001msec 00:08:52.323 WRITE: bw=542MiB/s (568MB/s), 542MiB/s-542MiB/s (568MB/s-568MB/s), io=5340MiB (5600MB), run=9860-9860msec 00:08:52.323 00:08:52.323 real 0m11.655s 00:08:52.323 user 2m44.955s 00:08:52.323 sys 0m1.310s 00:08:52.323 13:27:30 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:52.323 13:27:30 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:52.323 ************************************ 00:08:52.323 END TEST bdev_fio_rw_verify 00:08:52.323 ************************************ 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:08:52.323 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:52.325 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "26ffaa6f-666b-4317-bfd1-f2b64e05cac4"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "26ffaa6f-666b-4317-bfd1-f2b64e05cac4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "a7dba23a-8c7e-583d-abf3-26648a811565"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a7dba23a-8c7e-583d-abf3-26648a811565",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "3dd72c1d-4b48-5952-8451-a65e8c07a6e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3dd72c1d-4b48-5952-8451-a65e8c07a6e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3bc26509-470c-5ad5-bab0-c1814b88a99f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3bc26509-470c-5ad5-bab0-c1814b88a99f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "72d35a18-5d13-56a4-8fbf-60b23cf0375f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "72d35a18-5d13-56a4-8fbf-60b23cf0375f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "88b8e797-97ee-5460-a6ca-b28e077538c0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "88b8e797-97ee-5460-a6ca-b28e077538c0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "e799c281-7140-5f8d-bf44-c1d2f67d10df"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e799c281-7140-5f8d-bf44-c1d2f67d10df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f21b2f8f-ac17-5e44-8616-2818c301c778"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f21b2f8f-ac17-5e44-8616-2818c301c778",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"94d0730b-58a4-53f2-a681-9d59ded144b5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "94d0730b-58a4-53f2-a681-9d59ded144b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6d8b9fe5-4c76-545f-b848-0df8de5a731b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6d8b9fe5-4c76-545f-b848-0df8de5a731b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "760c27af-2c74-5bf1-8cbd-ae975869e026"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "760c27af-2c74-5bf1-8cbd-ae975869e026",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "bde1807e-00b6-54b5-87f7-ef0745695631"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bde1807e-00b6-54b5-87f7-ef0745695631",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "749127f1-0b3d-4fe0-8405-e8efcae6fe48"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "fc2be8be-4a76-442e-acd5-1477d06459d4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f7181078-131a-4d67-92a9-d00b5ace130a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "4838f73b-98ad-469c-affc-d1f7e6ace3e0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "b9369573-bfec-47e6-8a0c-e21253fdac95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "03d9aeb4-7541-4713-b407-1f62933a0bbb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "2f7aedea-939b-4742-8dc2-160109056578"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d86ee5a6-5e98-4a5f-91d0-fafcd7039bce",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e8d6cf89-ed34-46bd-8213-5529066353ec",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a681b5b2-d1bd-4ca7-a948-7d5037183869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a681b5b2-d1bd-4ca7-a948-7d5037183869",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:52.325 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:52.325 Malloc1p0 00:08:52.325 Malloc1p1 00:08:52.325 Malloc2p0 00:08:52.325 Malloc2p1 00:08:52.325 Malloc2p2 00:08:52.325 Malloc2p3 00:08:52.325 Malloc2p4 00:08:52.325 Malloc2p5 00:08:52.325 Malloc2p6 00:08:52.325 Malloc2p7 00:08:52.325 TestPT 00:08:52.325 raid0 00:08:52.325 concat0 ]] 00:08:52.325 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "26ffaa6f-666b-4317-bfd1-f2b64e05cac4"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "26ffaa6f-666b-4317-bfd1-f2b64e05cac4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "a7dba23a-8c7e-583d-abf3-26648a811565"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "a7dba23a-8c7e-583d-abf3-26648a811565",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "3dd72c1d-4b48-5952-8451-a65e8c07a6e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3dd72c1d-4b48-5952-8451-a65e8c07a6e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "3bc26509-470c-5ad5-bab0-c1814b88a99f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3bc26509-470c-5ad5-bab0-c1814b88a99f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "72d35a18-5d13-56a4-8fbf-60b23cf0375f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "72d35a18-5d13-56a4-8fbf-60b23cf0375f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "88b8e797-97ee-5460-a6ca-b28e077538c0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "88b8e797-97ee-5460-a6ca-b28e077538c0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "e799c281-7140-5f8d-bf44-c1d2f67d10df"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e799c281-7140-5f8d-bf44-c1d2f67d10df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "f21b2f8f-ac17-5e44-8616-2818c301c778"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f21b2f8f-ac17-5e44-8616-2818c301c778",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "94d0730b-58a4-53f2-a681-9d59ded144b5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "94d0730b-58a4-53f2-a681-9d59ded144b5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "6d8b9fe5-4c76-545f-b848-0df8de5a731b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6d8b9fe5-4c76-545f-b848-0df8de5a731b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "760c27af-2c74-5bf1-8cbd-ae975869e026"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "760c27af-2c74-5bf1-8cbd-ae975869e026",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "bde1807e-00b6-54b5-87f7-ef0745695631"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bde1807e-00b6-54b5-87f7-ef0745695631",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "749127f1-0b3d-4fe0-8405-e8efcae6fe48"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "749127f1-0b3d-4fe0-8405-e8efcae6fe48",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "fc2be8be-4a76-442e-acd5-1477d06459d4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "f7181078-131a-4d67-92a9-d00b5ace130a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "4838f73b-98ad-469c-affc-d1f7e6ace3e0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4838f73b-98ad-469c-affc-d1f7e6ace3e0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "b9369573-bfec-47e6-8a0c-e21253fdac95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "03d9aeb4-7541-4713-b407-1f62933a0bbb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "2f7aedea-939b-4742-8dc2-160109056578"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2f7aedea-939b-4742-8dc2-160109056578",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "d86ee5a6-5e98-4a5f-91d0-fafcd7039bce",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e8d6cf89-ed34-46bd-8213-5529066353ec",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a681b5b2-d1bd-4ca7-a948-7d5037183869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a681b5b2-d1bd-4ca7-a948-7d5037183869",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:52.326 13:27:30 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:52.326 ************************************ 00:08:52.326 START TEST bdev_fio_trim 00:08:52.326 ************************************ 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:52.326 13:27:30 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:08:52.326 13:27:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:52.326 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:52.326 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:52.327 13:27:31 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:52.327 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:52.327 fio-3.35 00:08:52.327 Starting 14 threads 00:09:04.556 00:09:04.556 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2054408: Mon Jul 15 13:27:42 2024 00:09:04.556 write: IOPS=120k, BW=470MiB/s (493MB/s)(4703MiB/10001msec); 0 zone resets 00:09:04.556 slat (usec): min=2, max=638, avg=40.74, stdev=12.31 
00:09:04.556 clat (usec): min=31, max=3590, avg=290.76, stdev=106.56 00:09:04.556 lat (usec): min=40, max=3611, avg=331.49, stdev=112.56 00:09:04.556 clat percentiles (usec): 00:09:04.556 | 50.000th=[ 277], 99.000th=[ 553], 99.900th=[ 611], 99.990th=[ 742], 00:09:04.556 | 99.999th=[ 988] 00:09:04.556 bw ( KiB/s): min=404320, max=710530, per=100.00%, avg=483461.16, stdev=6112.83, samples=266 00:09:04.556 iops : min=101080, max=177630, avg=120865.05, stdev=1528.18, samples=266 00:09:04.556 trim: IOPS=120k, BW=470MiB/s (493MB/s)(4703MiB/10001msec); 0 zone resets 00:09:04.556 slat (usec): min=5, max=1179, avg=27.75, stdev= 7.92 00:09:04.556 clat (usec): min=4, max=3611, avg=331.69, stdev=112.59 00:09:04.556 lat (usec): min=13, max=3629, avg=359.44, stdev=116.66 00:09:04.556 clat percentiles (usec): 00:09:04.556 | 50.000th=[ 318], 99.000th=[ 603], 99.900th=[ 668], 99.990th=[ 807], 00:09:04.556 | 99.999th=[ 1074] 00:09:04.556 bw ( KiB/s): min=404320, max=710530, per=100.00%, avg=483460.74, stdev=6112.84, samples=266 00:09:04.556 iops : min=101080, max=177630, avg=120865.16, stdev=1528.18, samples=266 00:09:04.556 lat (usec) : 10=0.01%, 20=0.01%, 50=0.01%, 100=0.66%, 250=32.29% 00:09:04.556 lat (usec) : 500=60.70%, 750=6.32%, 1000=0.02% 00:09:04.556 lat (msec) : 2=0.01%, 4=0.01% 00:09:04.556 cpu : usr=99.57%, sys=0.00%, ctx=709, majf=0, minf=1012 00:09:04.556 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:04.556 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:04.556 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:04.556 issued rwts: total=0,1203940,1203943,0 short=0,0,0,0 dropped=0,0,0,0 00:09:04.556 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:04.556 00:09:04.556 Run status group 0 (all jobs): 00:09:04.556 WRITE: bw=470MiB/s (493MB/s), 470MiB/s-470MiB/s (493MB/s-493MB/s), io=4703MiB (4931MB), run=10001-10001msec 00:09:04.556 TRIM: bw=470MiB/s (493MB/s), 
470MiB/s-470MiB/s (493MB/s-493MB/s), io=4703MiB (4931MB), run=10001-10001msec 00:09:04.556 00:09:04.556 real 0m11.779s 00:09:04.556 user 2m25.670s 00:09:04.556 sys 0m1.168s 00:09:04.556 13:27:42 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.556 13:27:42 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:04.556 ************************************ 00:09:04.556 END TEST bdev_fio_trim 00:09:04.556 ************************************ 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:04.556 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:04.556 00:09:04.556 real 0m23.821s 00:09:04.556 user 5m10.828s 00:09:04.556 sys 0m2.693s 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.556 13:27:42 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:04.556 ************************************ 00:09:04.556 END TEST bdev_fio 00:09:04.556 ************************************ 00:09:04.556 13:27:42 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:04.556 13:27:42 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:04.556 13:27:42 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
00:09:04.556 13:27:42 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:04.556 13:27:42 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.556 13:27:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.556 ************************************ 00:09:04.556 START TEST bdev_verify 00:09:04.556 ************************************ 00:09:04.556 13:27:42 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:04.556 [2024-07-15 13:27:42.958124] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:09:04.556 [2024-07-15 13:27:42.958189] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2055858 ] 00:09:04.556 [2024-07-15 13:27:43.086744] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:04.556 [2024-07-15 13:27:43.190790] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.556 [2024-07-15 13:27:43.190797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.556 [2024-07-15 13:27:43.344494] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.556 [2024-07-15 13:27:43.344547] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:04.556 [2024-07-15 13:27:43.344563] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:04.556 [2024-07-15 13:27:43.352500] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.556 [2024-07-15 13:27:43.352527] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.556 [2024-07-15 13:27:43.360512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.556 [2024-07-15 13:27:43.360535] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.556 [2024-07-15 13:27:43.437610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.556 [2024-07-15 13:27:43.437664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:04.556 [2024-07-15 13:27:43.437683] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab94d0 00:09:04.556 [2024-07-15 13:27:43.437696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:04.556 [2024-07-15 13:27:43.439331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:04.556 [2024-07-15 13:27:43.439369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:04.556 Running I/O for 5 seconds... 
00:09:09.823
00:09:09.823 Latency(us)
00:09:09.823 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:09.823 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x1000
00:09:09.823 Malloc0 : 5.20 1058.44 4.13 0.00 0.00 120670.00 733.72 269894.34
00:09:09.823 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x1000 length 0x1000
00:09:09.823 Malloc0 : 5.19 1036.58 4.05 0.00 0.00 123207.39 537.82 428548.45
00:09:09.823 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x800
00:09:09.823 Malloc1p0 : 5.21 540.93 2.11 0.00 0.00 235161.68 3675.71 253481.85
00:09:09.823 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x800 length 0x800
00:09:09.823 Malloc1p0 : 5.19 542.74 2.12 0.00 0.00 234426.78 3647.22 238892.97
00:09:09.823 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x800
00:09:09.823 Malloc1p1 : 5.21 540.33 2.11 0.00 0.00 234623.52 3604.48 248011.02
00:09:09.823 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x800 length 0x800
00:09:09.823 Malloc1p1 : 5.19 542.50 2.12 0.00 0.00 233707.90 3575.99 235245.75
00:09:09.823 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p0 : 5.22 539.75 2.11 0.00 0.00 234131.63 3561.74 244363.80
00:09:09.823 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p0 : 5.19 542.27 2.12 0.00 0.00 233046.14 3561.74 229774.91
00:09:09.823 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p1 : 5.22 539.22 2.11 0.00 0.00 233591.84 3618.73 237069.36
00:09:09.823 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p1 : 5.20 542.02 2.12 0.00 0.00 232363.71 3618.73 225215.89
00:09:09.823 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p2 : 5.22 539.00 2.11 0.00 0.00 232917.89 3519.00 232510.33
00:09:09.823 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p2 : 5.20 541.78 2.12 0.00 0.00 231700.55 3519.00 218833.25
00:09:09.823 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p3 : 5.23 538.78 2.10 0.00 0.00 232240.68 3561.74 228863.11
00:09:09.823 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p3 : 5.20 541.53 2.12 0.00 0.00 231036.84 3561.74 214274.23
00:09:09.823 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p4 : 5.23 538.56 2.10 0.00 0.00 231598.09 3561.74 224304.08
00:09:09.823 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p4 : 5.21 540.92 2.11 0.00 0.00 230544.10 3561.74 209715.20
00:09:09.823 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p5 : 5.23 538.34 2.10 0.00 0.00 230976.63 3476.26 221568.67
00:09:09.823 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x200 length 0x200
00:09:09.823 Malloc2p5 : 5.21 540.33 2.11 0.00 0.00 230104.14 3447.76 206067.98
00:09:09.823 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.823 Verification LBA range: start 0x0 length 0x200
00:09:09.823 Malloc2p6 : 5.23 538.12 2.10 0.00 0.00 230375.69 3533.25 218833.25
00:09:09.823 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x200 length 0x200
00:09:09.824 Malloc2p6 : 5.22 539.75 2.11 0.00 0.00 229661.75 3547.49 203332.56
00:09:09.824 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x200
00:09:09.824 Malloc2p7 : 5.24 537.90 2.10 0.00 0.00 229766.04 3447.76 213362.42
00:09:09.824 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x200 length 0x200
00:09:09.824 Malloc2p7 : 5.29 556.91 2.18 0.00 0.00 221996.23 3447.76 201508.95
00:09:09.824 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x1000
00:09:09.824 TestPT : 5.29 534.28 2.09 0.00 0.00 230115.29 12195.39 215186.03
00:09:09.824 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x1000 length 0x1000
00:09:09.824 TestPT : 5.29 532.48 2.08 0.00 0.00 231327.02 15386.71 284483.23
00:09:09.824 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x2000
00:09:09.824 raid0 : 5.29 556.35 2.17 0.00 0.00 220599.56 3504.75 190567.29
00:09:09.824 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x2000 length 0x2000
00:09:09.824 raid0 : 5.29 556.44 2.17 0.00 0.00 220633.79 3504.75 174154.80
00:09:09.824 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x2000
00:09:09.824 concat0 : 5.29 556.11 2.17 0.00 0.00 220034.17 3419.27 186008.26
00:09:09.824 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x2000 length 0x2000
00:09:09.824 concat0 : 5.29 556.20 2.17 0.00 0.00 220059.91 3405.02 169595.77
00:09:09.824 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x1000
00:09:09.824 raid1 : 5.30 555.85 2.17 0.00 0.00 219471.64 3932.16 179625.63
00:09:09.824 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x1000 length 0x1000
00:09:09.824 raid1 : 5.30 555.95 2.17 0.00 0.00 219499.81 4017.64 177802.02
00:09:09.824 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x0 length 0x4e2
00:09:09.824 AIO0 : 5.30 555.56 2.17 0.00 0.00 218888.01 1602.78 187831.87
00:09:09.824 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:09.824 Verification LBA range: start 0x4e2 length 0x4e2
00:09:09.824 AIO0 : 5.30 555.77 2.17 0.00 0.00 218876.33 1617.03 184184.65
00:09:09.824 ===================================================================================================================
00:09:09.824 Total : 18431.68 72.00 0.00 0.00 216341.06 537.82 428548.45
00:09:10.083
00:09:10.083 real 0m6.564s
00:09:10.083 user 0m12.157s
00:09:10.083 sys 0m0.409s
00:09:10.083 13:27:49 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:10.083 13:27:49 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:09:10.083 ************************************
00:09:10.083 END TEST bdev_verify
00:09:10.083 ************************************
00:09:10.083 13:27:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:10.083 13:27:49 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:10.083 13:27:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:09:10.083 13:27:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:10.083 13:27:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:10.343 ************************************
00:09:10.343 START TEST bdev_verify_big_io
00:09:10.343 ************************************
00:09:10.343 13:27:49 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:10.343 [2024-07-15 13:27:49.592235] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:09:10.343 [2024-07-15 13:27:49.592295] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2056755 ]
00:09:10.343 [2024-07-15 13:27:49.710919] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:09:10.602 [2024-07-15 13:27:49.813305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:10.602 [2024-07-15 13:27:49.813311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:10.602 [2024-07-15 13:27:49.969494] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:10.602 [2024-07-15 13:27:49.969556] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:10.602 [2024-07-15 13:27:49.969571] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:10.602 [2024-07-15 13:27:49.977496] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:10.602 [2024-07-15 13:27:49.977526] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:10.602 [2024-07-15 13:27:49.985527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:10.602 [2024-07-15 13:27:49.985553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:10.860 [2024-07-15 13:27:50.059737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:10.860 [2024-07-15 13:27:50.059788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:10.860 [2024-07-15 13:27:50.059807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa1d4d0
00:09:10.860 [2024-07-15 13:27:50.059819] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:10.860 [2024-07-15 13:27:50.061443] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:10.860 [2024-07-15 13:27:50.061470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:10.860 [2024-07-15 13:27:50.227453] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.228544] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.230325] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.231564] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.233348] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.234557] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.236315] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.238114] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.239100] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.240461] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.241385] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.242755] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.243648] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.245035] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:10.860 [2024-07-15 13:27:50.245919] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:10.861 [2024-07-15 13:27:50.247299] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:10.861 [2024-07-15 13:27:50.268548] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:10.861 [2024-07-15 13:27:50.270376] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:11.119 Running I/O for 5 seconds...
00:09:19.293
00:09:19.293 Latency(us)
00:09:19.293 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:19.293 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x0 length 0x100
00:09:19.293 Malloc0 : 6.08 168.54 10.53 0.00 0.00 745483.10 894.00 2188332.52
00:09:19.293 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x100 length 0x100
00:09:19.293 Malloc0 : 5.69 157.60 9.85 0.00 0.00 796470.95 865.50 2202921.41
00:09:19.293 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x0 length 0x80
00:09:19.293 Malloc1p0 : 6.38 60.15 3.76 0.00 0.00 1977919.24 2977.61 3151198.83
00:09:19.293 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x80 length 0x80
00:09:19.293 Malloc1p0 : 6.18 70.60 4.41 0.00 0.00 1659243.87 3262.55 2640587.91
00:09:19.293 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x0 length 0x80
00:09:19.293 Malloc1p1 : 6.75 37.94 2.37 0.00 0.00 2967341.98 1517.30 5193642.52
00:09:19.293 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x80 length 0x80
00:09:19.293 Malloc1p1 : 6.74 37.96 2.37 0.00 0.00 2939706.89 1510.18 5135286.98
00:09:19.293 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.293 Verification LBA range: start 0x0 length 0x20
00:09:19.293 Malloc2p0 : 6.18 25.87 1.62 0.00 0.00 1102763.02 641.11 1838199.32
00:09:19.294 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p0 : 6.18 25.90 1.62 0.00 0.00 1094992.06 648.24 1881965.97
00:09:19.294 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p1 : 6.19 25.86 1.62 0.00 0.00 1093706.34 644.67 1816315.99
00:09:19.294 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p1 : 6.18 25.89 1.62 0.00 0.00 1085630.63 644.67 1845493.76
00:09:19.294 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p2 : 6.19 25.86 1.62 0.00 0.00 1083985.98 641.11 1787138.23
00:09:19.294 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p2 : 6.18 25.89 1.62 0.00 0.00 1076009.22 658.92 1823610.43
00:09:19.294 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p3 : 6.19 25.85 1.62 0.00 0.00 1074666.20 633.99 1765254.90
00:09:19.294 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p3 : 6.18 25.88 1.62 0.00 0.00 1066679.42 658.92 1794432.67
00:09:19.294 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p4 : 6.19 25.85 1.62 0.00 0.00 1065678.41 648.24 1736077.13
00:09:19.294 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p4 : 6.18 25.88 1.62 0.00 0.00 1057458.90 641.11 1772549.34
00:09:19.294 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p5 : 6.19 25.84 1.62 0.00 0.00 1056264.80 644.67 1714193.81
00:09:19.294 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p5 : 6.26 28.11 1.76 0.00 0.00 974412.45 651.80 1743371.58
00:09:19.294 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p6 : 6.19 25.83 1.61 0.00 0.00 1046489.68 641.11 1685016.04
00:09:19.294 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p6 : 6.26 28.10 1.76 0.00 0.00 965710.91 651.80 1721488.25
00:09:19.294 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x20
00:09:19.294 Malloc2p7 : 6.19 25.83 1.61 0.00 0.00 1037492.72 655.36 1663132.72
00:09:19.294 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x20 length 0x20
00:09:19.294 Malloc2p7 : 6.26 28.10 1.76 0.00 0.00 957716.77 655.36 1699604.93
00:09:19.294 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x100
00:09:19.294 TestPT : 6.81 37.86 2.37 0.00 0.00 2706934.11 89812.81 3997354.07
00:09:19.294 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x100 length 0x100
00:09:19.294 TestPT : 6.81 37.59 2.35 0.00 0.00 2702597.79 102122.18 3968176.31
00:09:19.294 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x200
00:09:19.294 raid0 : 6.82 42.24 2.64 0.00 0.00 2372284.59 1631.28 4668442.71
00:09:19.294 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x200 length 0x200
00:09:19.294 raid0 : 6.82 42.24 2.64 0.00 0.00 2349600.41 1624.15 4580909.41
00:09:19.294 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x200
00:09:19.294 concat0 : 6.75 47.40 2.96 0.00 0.00 2068737.88 1617.03 4493376.11
00:09:19.294 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x200 length 0x200
00:09:19.294 concat0 : 6.75 49.80 3.11 0.00 0.00 1967133.44 1638.40 4435020.58
00:09:19.294 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x100
00:09:19.294 raid1 : 6.75 61.45 3.84 0.00 0.00 1565427.45 2265.27 4318309.51
00:09:19.294 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x100 length 0x100
00:09:19.294 raid1 : 6.81 73.22 4.58 0.00 0.00 1315600.14 2251.02 4259953.98
00:09:19.294 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x0 length 0x4e
00:09:19.294 AIO0 : 6.82 69.18 4.32 0.00 0.00 827948.58 794.27 2582232.38
00:09:19.294 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:09:19.294 Verification LBA range: start 0x4e length 0x4e
00:09:19.294 AIO0 : 6.82 55.70 3.48 0.00 0.00 1027179.47 829.89 2669765.68
00:09:19.294 ===================================================================================================================
00:09:19.294 Total : 1470.00 91.88 0.00 0.00 1426936.17 633.99 5193642.52
00:09:19.294
00:09:19.294 real 0m8.120s
00:09:19.294 user 0m15.266s
00:09:19.294 sys 0m0.410s
00:09:19.294 13:27:57 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:19.294 13:27:57 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:09:19.294 ************************************
00:09:19.294 END TEST bdev_verify_big_io
00:09:19.294 ************************************
00:09:19.294 13:27:57 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:19.294 13:27:57 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:19.294 13:27:57 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:19.294 13:27:57 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:19.294 13:27:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:19.294 ************************************
00:09:19.294 START TEST bdev_write_zeroes
00:09:19.294 ************************************
00:09:19.294 13:27:57 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:19.294 [2024-07-15 13:27:57.810593] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:09:19.294 [2024-07-15 13:27:57.810661] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2057829 ]
00:09:19.294 [2024-07-15 13:27:57.941724] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:19.294 [2024-07-15 13:27:58.046076] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:19.294 [2024-07-15 13:27:58.210024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:19.294 [2024-07-15 13:27:58.210088] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:19.294 [2024-07-15 13:27:58.210104] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:19.294 [2024-07-15 13:27:58.218030] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:19.294 [2024-07-15 13:27:58.218057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:19.294 [2024-07-15 13:27:58.226038] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:19.294 [2024-07-15 13:27:58.226062] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:19.294 [2024-07-15 13:27:58.302493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:19.294 [2024-07-15 13:27:58.302551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:19.294 [2024-07-15 13:27:58.302570] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12bac10
00:09:19.294 [2024-07-15 13:27:58.302583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:19.294 [2024-07-15 13:27:58.304081] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:19.294 [2024-07-15 13:27:58.304129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:19.294 Running I/O for 1 seconds...
00:09:20.231
00:09:20.231 Latency(us)
00:09:20.231 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:20.231 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc0 : 1.05 4994.06 19.51 0.00 0.00 25611.44 669.61 43082.80
00:09:20.231 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc1p0 : 1.05 4986.98 19.48 0.00 0.00 25601.92 911.81 42170.99
00:09:20.231 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc1p1 : 1.05 4979.91 19.45 0.00 0.00 25582.46 911.81 41259.19
00:09:20.231 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p0 : 1.06 4972.83 19.43 0.00 0.00 25560.11 911.81 40347.38
00:09:20.231 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p1 : 1.06 4965.89 19.40 0.00 0.00 25538.55 908.24 39435.58
00:09:20.231 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p2 : 1.06 4958.93 19.37 0.00 0.00 25519.72 911.81 38523.77
00:09:20.231 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p3 : 1.06 4951.96 19.34 0.00 0.00 25498.52 908.24 37611.97
00:09:20.231 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p4 : 1.06 4945.04 19.32 0.00 0.00 25478.94 933.18 36700.16
00:09:20.231 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p5 : 1.06 4938.17 19.29 0.00 0.00 25461.83 908.24 35788.35
00:09:20.231 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p6 : 1.06 4931.24 19.26 0.00 0.00 25444.33 908.24 34876.55
00:09:20.231 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 Malloc2p7 : 1.07 4924.41 19.24 0.00 0.00 25425.42 911.81 33964.74
00:09:20.231 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 TestPT : 1.07 4917.57 19.21 0.00 0.00 25403.88 947.42 33280.89
00:09:20.231 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 raid0 : 1.07 4909.68 19.18 0.00 0.00 25370.95 1631.28 31685.23
00:09:20.231 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 concat0 : 1.07 4901.94 19.15 0.00 0.00 25318.49 1631.28 30089.57
00:09:20.231 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 raid1 : 1.07 4892.30 19.11 0.00 0.00 25258.13 2592.95 27468.13
00:09:20.231 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:20.231 AIO0 : 1.07 4886.41 19.09 0.00 0.00 25163.64 1040.03 26328.38
00:09:20.231 ===================================================================================================================
00:09:20.231 Total : 79057.33 308.82 0.00 0.00 25452.40 669.61 43082.80
00:09:20.799
00:09:20.799 real 0m2.248s
00:09:20.799 user 0m1.832s
00:09:20.799 sys 0m0.356s
00:09:20.799 13:27:59 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:20.799 13:27:59 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:09:20.799 ************************************
00:09:20.799 END TEST bdev_write_zeroes
00:09:20.799 ************************************
00:09:20.799 13:28:00 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:20.799 13:28:00 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:20.799 13:28:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:20.799 13:28:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:20.799 13:28:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:20.799 ************************************
00:09:20.799 START TEST bdev_json_nonenclosed
00:09:20.799 ************************************
00:09:20.799 13:28:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:20.799 [2024-07-15 13:28:00.144163] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:09:20.799 [2024-07-15 13:28:00.144232] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058144 ]
00:09:21.058 [2024-07-15 13:28:00.274086] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:21.058 [2024-07-15 13:28:00.377846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:21.058 [2024-07-15 13:28:00.377924] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:09:21.058 [2024-07-15 13:28:00.377951] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:21.058 [2024-07-15 13:28:00.377963] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:21.058
00:09:21.058 real 0m0.400s
00:09:21.058 user 0m0.242s
00:09:21.058 sys 0m0.155s
00:09:21.059 13:28:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:09:21.059 13:28:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:21.317 13:28:00 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:09:21.317 ************************************
00:09:21.317 END TEST bdev_json_nonenclosed
00:09:21.317 ************************************
00:09:21.317 13:28:00 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:21.317 13:28:00 blockdev_general -- bdev/blockdev.sh@782 -- # true
00:09:21.317 13:28:00 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:21.317 13:28:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:21.317 13:28:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:21.317 13:28:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:21.317 ************************************
00:09:21.317 START TEST bdev_json_nonarray
00:09:21.317 ************************************
00:09:21.317 13:28:00 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:21.317 [2024-07-15 13:28:00.632850] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:09:21.317 [2024-07-15 13:28:00.632913] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058223 ]
00:09:21.576 [2024-07-15 13:28:00.763358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:21.576 [2024-07-15 13:28:00.861340] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:21.576 [2024-07-15 13:28:00.861409] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:09:21.576 [2024-07-15 13:28:00.861431] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:21.576 [2024-07-15 13:28:00.861443] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:21.576
00:09:21.576 real 0m0.392s
00:09:21.576 user 0m0.223s
00:09:21.576 sys 0m0.165s
00:09:21.576 13:28:00 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:09:21.576 13:28:00 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:21.576 13:28:00 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:09:21.576 ************************************
00:09:21.576 END TEST bdev_json_nonarray
00:09:21.576 ************************************
00:09:21.835 13:28:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:21.835 13:28:01 blockdev_general -- bdev/blockdev.sh@785 -- # true
00:09:21.835 13:28:01 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:09:21.835 13:28:01 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:09:21.835 13:28:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:09:21.835 13:28:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:21.835 13:28:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:21.835 ************************************
00:09:21.835 START TEST bdev_qos
00:09:21.835 ************************************
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2058250
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2058250'
00:09:21.835 Process qos testing pid: 2058250
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2058250
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2058250 ']'
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 ''
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:09:21.835 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:21.835 13:28:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:21.835 [2024-07-15 13:28:01.103643] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:09:21.835 [2024-07-15 13:28:01.103714] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2058250 ] 00:09:21.835 [2024-07-15 13:28:01.222857] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.094 [2024-07-15 13:28:01.324672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.030 Malloc_0 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:23.030 13:28:02 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.030 [ 00:09:23.030 { 00:09:23.030 "name": "Malloc_0", 00:09:23.030 "aliases": [ 00:09:23.030 "9ce9746c-968d-4247-8e49-90e0e9d4d501" 00:09:23.030 ], 00:09:23.030 "product_name": "Malloc disk", 00:09:23.030 "block_size": 512, 00:09:23.030 "num_blocks": 262144, 00:09:23.030 "uuid": "9ce9746c-968d-4247-8e49-90e0e9d4d501", 00:09:23.030 "assigned_rate_limits": { 00:09:23.030 "rw_ios_per_sec": 0, 00:09:23.030 "rw_mbytes_per_sec": 0, 00:09:23.030 "r_mbytes_per_sec": 0, 00:09:23.030 "w_mbytes_per_sec": 0 00:09:23.030 }, 00:09:23.030 "claimed": false, 00:09:23.030 "zoned": false, 00:09:23.030 "supported_io_types": { 00:09:23.030 "read": true, 00:09:23.030 "write": true, 00:09:23.030 "unmap": true, 00:09:23.030 "flush": true, 00:09:23.030 "reset": true, 00:09:23.030 "nvme_admin": false, 00:09:23.030 "nvme_io": false, 00:09:23.030 "nvme_io_md": false, 00:09:23.030 "write_zeroes": true, 00:09:23.030 "zcopy": true, 00:09:23.030 "get_zone_info": false, 00:09:23.030 "zone_management": false, 00:09:23.030 "zone_append": false, 00:09:23.030 "compare": false, 00:09:23.030 "compare_and_write": false, 00:09:23.030 "abort": true, 00:09:23.030 "seek_hole": false, 00:09:23.030 
"seek_data": false, 00:09:23.030 "copy": true, 00:09:23.030 "nvme_iov_md": false 00:09:23.030 }, 00:09:23.030 "memory_domains": [ 00:09:23.030 { 00:09:23.030 "dma_device_id": "system", 00:09:23.030 "dma_device_type": 1 00:09:23.030 }, 00:09:23.030 { 00:09:23.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.030 "dma_device_type": 2 00:09:23.030 } 00:09:23.030 ], 00:09:23.030 "driver_specific": {} 00:09:23.030 } 00:09:23.030 ] 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.030 Null_1 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:23.030 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.031 13:28:02 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:23.031 [ 00:09:23.031 { 00:09:23.031 "name": "Null_1", 00:09:23.031 "aliases": [ 00:09:23.031 "0e0ef539-9714-45ca-bebf-e9b788603f2c" 00:09:23.031 ], 00:09:23.031 "product_name": "Null disk", 00:09:23.031 "block_size": 512, 00:09:23.031 "num_blocks": 262144, 00:09:23.031 "uuid": "0e0ef539-9714-45ca-bebf-e9b788603f2c", 00:09:23.031 "assigned_rate_limits": { 00:09:23.031 "rw_ios_per_sec": 0, 00:09:23.031 "rw_mbytes_per_sec": 0, 00:09:23.031 "r_mbytes_per_sec": 0, 00:09:23.031 "w_mbytes_per_sec": 0 00:09:23.031 }, 00:09:23.031 "claimed": false, 00:09:23.031 "zoned": false, 00:09:23.031 "supported_io_types": { 00:09:23.031 "read": true, 00:09:23.031 "write": true, 00:09:23.031 "unmap": false, 00:09:23.031 "flush": false, 00:09:23.031 "reset": true, 00:09:23.031 "nvme_admin": false, 00:09:23.031 "nvme_io": false, 00:09:23.031 "nvme_io_md": false, 00:09:23.031 "write_zeroes": true, 00:09:23.031 "zcopy": false, 00:09:23.031 "get_zone_info": false, 00:09:23.031 "zone_management": false, 00:09:23.031 "zone_append": false, 00:09:23.031 "compare": false, 00:09:23.031 "compare_and_write": false, 00:09:23.031 "abort": true, 00:09:23.031 "seek_hole": false, 00:09:23.031 "seek_data": false, 00:09:23.031 "copy": false, 00:09:23.031 "nvme_iov_md": false 00:09:23.031 }, 00:09:23.031 "driver_specific": {} 00:09:23.031 } 00:09:23.031 ] 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:23.031 13:28:02 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:23.289 Running I/O for 60 seconds... 
00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 62909.41 251637.62 0.00 0.00 252928.00 0.00 0.00 ' 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=62909.41 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 62909 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=62909 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.560 13:28:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:28.560 ************************************ 00:09:28.560 START TEST bdev_qos_iops 00:09:28.560 ************************************ 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:09:28.560 13:28:07 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:28.560 13:28:07 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14999.33 59997.32 0.00 0.00 60900.00 0.00 0.00 ' 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14999.33 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14999 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14999 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:09:33.833 13:28:12 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14999 -lt 13500 ']' 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14999 -gt 16500 ']' 00:09:33.833 00:09:33.833 real 0m5.238s 00:09:33.833 user 0m0.114s 00:09:33.833 sys 0m0.046s 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:33.833 13:28:12 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:33.833 ************************************ 00:09:33.833 END TEST bdev_qos_iops 00:09:33.833 ************************************ 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:33.833 13:28:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20164.85 80659.41 0.00 0.00 81920.00 0.00 0.00 ' 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:39.102 13:28:18 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:39.102 13:28:18 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.102 ************************************ 00:09:39.102 START TEST bdev_qos_bw 00:09:39.102 ************************************ 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:39.102 
13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:39.102 13:28:18 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.88 8191.51 0.00 0.00 8404.00 0.00 0.00 ' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8404.00 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8404 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8404 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 8404 -lt 7372 ']' 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8404 -gt 9011 ']' 00:09:44.402 00:09:44.402 real 0m5.282s 00:09:44.402 user 0m0.115s 00:09:44.402 sys 0m0.043s 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:44.402 ************************************ 00:09:44.402 END TEST bdev_qos_bw 00:09:44.402 ************************************ 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:44.402 13:28:23 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:44.402 ************************************ 00:09:44.402 START TEST bdev_qos_ro_bw 00:09:44.402 ************************************ 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:44.402 13:28:23 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:44.402 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:44.403 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:44.403 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:44.403 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:44.403 13:28:23 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.66 2046.63 0.00 0.00 2060.00 0.00 0.00 ' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:09:49.678 00:09:49.678 real 0m5.181s 00:09:49.678 user 0m0.115s 00:09:49.678 sys 0m0.041s 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:49.678 13:28:28 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:49.678 ************************************ 00:09:49.678 END TEST bdev_qos_ro_bw 00:09:49.678 ************************************ 00:09:49.678 13:28:28 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:49.678 13:28:28 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:49.678 13:28:28 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.678 13:28:28 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.937 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.937 13:28:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:49.937 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.937 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.197 00:09:50.197 Latency(us) 00:09:50.197 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:50.197 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:50.197 Malloc_0 : 26.66 20777.54 81.16 0.00 0.00 12204.13 1994.57 503316.48 
00:09:50.197 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:50.197 Null_1 : 26.81 20449.64 79.88 0.00 0.00 12484.52 819.20 150447.86 00:09:50.197 =================================================================================================================== 00:09:50.197 Total : 41227.18 161.04 0.00 0.00 12343.60 819.20 503316.48 00:09:50.197 0 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2058250 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2058250 ']' 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2058250 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2058250 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2058250' 00:09:50.197 killing process with pid 2058250 00:09:50.197 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2058250 00:09:50.197 Received shutdown signal, test time was about 26.865233 seconds 00:09:50.197 00:09:50.197 Latency(us) 00:09:50.197 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:50.197 =================================================================================================================== 00:09:50.197 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:50.197 
13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2058250 00:09:50.456 13:28:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:50.456 00:09:50.456 real 0m28.626s 00:09:50.456 user 0m29.614s 00:09:50.456 sys 0m0.844s 00:09:50.456 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:50.456 13:28:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.456 ************************************ 00:09:50.456 END TEST bdev_qos 00:09:50.456 ************************************ 00:09:50.456 13:28:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:50.456 13:28:29 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:50.456 13:28:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:50.456 13:28:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:50.456 13:28:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:50.456 ************************************ 00:09:50.456 START TEST bdev_qd_sampling 00:09:50.456 ************************************ 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2062049 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2062049' 00:09:50.456 Process bdev QD sampling period testing pid: 2062049 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:09:50.456 13:28:29 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2062049 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2062049 ']' 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:50.456 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:50.456 13:28:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:50.456 [2024-07-15 13:28:29.823366] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:09:50.456 [2024-07-15 13:28:29.823439] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2062049 ] 00:09:50.715 [2024-07-15 13:28:29.954983] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:50.715 [2024-07-15 13:28:30.066122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:50.715 [2024-07-15 13:28:30.066128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:51.652 Malloc_QD 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:51.652 13:28:30 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:51.652 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:51.652 [ 00:09:51.652 { 00:09:51.652 "name": "Malloc_QD", 00:09:51.652 "aliases": [ 00:09:51.652 "b16fddae-f51a-4952-b59c-5f9160b6afc5" 00:09:51.652 ], 00:09:51.652 "product_name": "Malloc disk", 00:09:51.652 "block_size": 512, 00:09:51.652 "num_blocks": 262144, 00:09:51.652 "uuid": "b16fddae-f51a-4952-b59c-5f9160b6afc5", 00:09:51.652 "assigned_rate_limits": { 00:09:51.652 "rw_ios_per_sec": 0, 00:09:51.652 "rw_mbytes_per_sec": 0, 00:09:51.652 "r_mbytes_per_sec": 0, 00:09:51.652 "w_mbytes_per_sec": 0 00:09:51.652 }, 00:09:51.652 "claimed": false, 00:09:51.652 "zoned": false, 00:09:51.652 "supported_io_types": { 00:09:51.652 "read": true, 00:09:51.652 "write": true, 00:09:51.652 "unmap": true, 00:09:51.652 "flush": true, 00:09:51.652 "reset": true, 00:09:51.652 "nvme_admin": false, 00:09:51.652 "nvme_io": false, 00:09:51.652 "nvme_io_md": false, 00:09:51.652 "write_zeroes": true, 00:09:51.652 "zcopy": true, 00:09:51.652 "get_zone_info": false, 00:09:51.652 "zone_management": false, 00:09:51.652 "zone_append": false, 00:09:51.652 "compare": false, 00:09:51.652 "compare_and_write": false, 00:09:51.652 "abort": true, 00:09:51.652 "seek_hole": false, 00:09:51.652 "seek_data": false, 00:09:51.652 "copy": true, 
00:09:51.652 "nvme_iov_md": false 00:09:51.652 }, 00:09:51.652 "memory_domains": [ 00:09:51.652 { 00:09:51.652 "dma_device_id": "system", 00:09:51.652 "dma_device_type": 1 00:09:51.652 }, 00:09:51.652 { 00:09:51.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:51.652 "dma_device_type": 2 00:09:51.653 } 00:09:51.653 ], 00:09:51.653 "driver_specific": {} 00:09:51.653 } 00:09:51.653 ] 00:09:51.653 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:51.653 13:28:30 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:09:51.653 13:28:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:51.653 13:28:30 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:51.653 Running I/O for 5 seconds... 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:09:53.558 "tick_rate": 2300000000, 00:09:53.558 "ticks": 5359586329002576, 00:09:53.558 "bdevs": [ 00:09:53.558 { 00:09:53.558 "name": "Malloc_QD", 00:09:53.558 "bytes_read": 772846080, 00:09:53.558 "num_read_ops": 188676, 00:09:53.558 "bytes_written": 0, 00:09:53.558 "num_write_ops": 0, 00:09:53.558 "bytes_unmapped": 0, 00:09:53.558 "num_unmap_ops": 0, 00:09:53.558 "bytes_copied": 0, 00:09:53.558 "num_copy_ops": 0, 00:09:53.558 "read_latency_ticks": 2239982469344, 00:09:53.558 "max_read_latency_ticks": 14794118, 00:09:53.558 "min_read_latency_ticks": 275412, 00:09:53.558 "write_latency_ticks": 0, 00:09:53.558 "max_write_latency_ticks": 0, 00:09:53.558 "min_write_latency_ticks": 0, 00:09:53.558 "unmap_latency_ticks": 0, 00:09:53.558 "max_unmap_latency_ticks": 0, 00:09:53.558 "min_unmap_latency_ticks": 0, 00:09:53.558 "copy_latency_ticks": 0, 00:09:53.558 "max_copy_latency_ticks": 0, 00:09:53.558 "min_copy_latency_ticks": 0, 00:09:53.558 "io_error": {}, 00:09:53.558 "queue_depth_polling_period": 10, 00:09:53.558 "queue_depth": 512, 00:09:53.558 "io_time": 30, 00:09:53.558 "weighted_io_time": 15360 00:09:53.558 } 00:09:53.558 ] 00:09:53.558 }' 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:53.558 13:28:32 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.558 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:53.558 00:09:53.558 Latency(us) 00:09:53.558 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:53.558 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:53.558 Malloc_QD : 1.98 48800.15 190.63 0.00 0.00 5231.96 1923.34 5755.77 00:09:53.558 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:53.558 Malloc_QD : 1.98 50221.05 196.18 0.00 0.00 5084.58 1787.99 6439.62 00:09:53.558 =================================================================================================================== 00:09:53.559 Total : 99021.20 386.80 0.00 0.00 5157.21 1787.99 6439.62 00:09:53.559 0 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2062049 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2062049 ']' 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2062049 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:53.559 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2062049 00:09:53.817 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:53.817 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:09:53.817 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2062049' 00:09:53.817 killing process with pid 2062049 00:09:53.817 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2062049 00:09:53.817 Received shutdown signal, test time was about 2.060247 seconds 00:09:53.817 00:09:53.817 Latency(us) 00:09:53.817 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:53.817 =================================================================================================================== 00:09:53.817 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:53.817 13:28:32 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2062049 00:09:53.817 13:28:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:53.817 00:09:53.817 real 0m3.464s 00:09:53.817 user 0m6.761s 00:09:53.817 sys 0m0.430s 00:09:53.817 13:28:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:53.817 13:28:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:53.817 ************************************ 00:09:53.817 END TEST bdev_qd_sampling 00:09:53.817 ************************************ 00:09:54.075 13:28:33 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:54.075 13:28:33 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:54.075 13:28:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:54.075 13:28:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.075 13:28:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:54.075 ************************************ 00:09:54.075 START TEST bdev_error 00:09:54.075 ************************************ 00:09:54.075 13:28:33 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2062599 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2062599' 00:09:54.075 Process error testing pid: 2062599 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:54.075 13:28:33 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2062599 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2062599 ']' 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:54.075 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:54.075 13:28:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:54.075 [2024-07-15 13:28:33.375722] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:09:54.075 [2024-07-15 13:28:33.375795] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2062599 ] 00:09:54.075 [2024-07-15 13:28:33.499161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.333 [2024-07-15 13:28:33.597823] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:54.901 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:54.901 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:09:54.901 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:54.901 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.901 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 Dev_1 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 
13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 [ 00:09:55.160 { 00:09:55.160 "name": "Dev_1", 00:09:55.160 "aliases": [ 00:09:55.160 "b0119a4b-3738-4697-a862-579fc7d47cea" 00:09:55.160 ], 00:09:55.160 "product_name": "Malloc disk", 00:09:55.160 "block_size": 512, 00:09:55.160 "num_blocks": 262144, 00:09:55.160 "uuid": "b0119a4b-3738-4697-a862-579fc7d47cea", 00:09:55.160 "assigned_rate_limits": { 00:09:55.160 "rw_ios_per_sec": 0, 00:09:55.160 "rw_mbytes_per_sec": 0, 00:09:55.160 "r_mbytes_per_sec": 0, 00:09:55.160 "w_mbytes_per_sec": 0 00:09:55.160 }, 00:09:55.160 "claimed": false, 00:09:55.160 "zoned": false, 00:09:55.160 "supported_io_types": { 00:09:55.160 "read": true, 00:09:55.160 "write": true, 00:09:55.160 "unmap": true, 00:09:55.160 "flush": true, 00:09:55.160 "reset": true, 00:09:55.160 "nvme_admin": false, 00:09:55.160 "nvme_io": false, 00:09:55.160 "nvme_io_md": false, 00:09:55.160 "write_zeroes": true, 00:09:55.160 "zcopy": true, 00:09:55.160 "get_zone_info": false, 00:09:55.160 "zone_management": false, 00:09:55.160 "zone_append": false, 00:09:55.160 "compare": false, 00:09:55.160 "compare_and_write": false, 00:09:55.160 "abort": true, 00:09:55.160 "seek_hole": false, 00:09:55.160 "seek_data": false, 00:09:55.160 "copy": true, 00:09:55.160 "nvme_iov_md": false 00:09:55.160 }, 00:09:55.160 "memory_domains": [ 00:09:55.160 { 00:09:55.160 "dma_device_id": "system", 00:09:55.160 "dma_device_type": 1 00:09:55.160 }, 00:09:55.160 { 00:09:55.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:09:55.160 "dma_device_type": 2 00:09:55.160 } 00:09:55.160 ], 00:09:55.160 "driver_specific": {} 00:09:55.160 } 00:09:55.160 ] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 true 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 Dev_2 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:55.160 13:28:34 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 [ 00:09:55.160 { 00:09:55.160 "name": "Dev_2", 00:09:55.160 "aliases": [ 00:09:55.160 "c74e7c5c-2127-4264-a1b6-360880d03691" 00:09:55.160 ], 00:09:55.160 "product_name": "Malloc disk", 00:09:55.160 "block_size": 512, 00:09:55.160 "num_blocks": 262144, 00:09:55.160 "uuid": "c74e7c5c-2127-4264-a1b6-360880d03691", 00:09:55.160 "assigned_rate_limits": { 00:09:55.160 "rw_ios_per_sec": 0, 00:09:55.160 "rw_mbytes_per_sec": 0, 00:09:55.160 "r_mbytes_per_sec": 0, 00:09:55.160 "w_mbytes_per_sec": 0 00:09:55.160 }, 00:09:55.160 "claimed": false, 00:09:55.160 "zoned": false, 00:09:55.160 "supported_io_types": { 00:09:55.160 "read": true, 00:09:55.160 "write": true, 00:09:55.160 "unmap": true, 00:09:55.160 "flush": true, 00:09:55.160 "reset": true, 00:09:55.160 "nvme_admin": false, 00:09:55.160 "nvme_io": false, 00:09:55.160 "nvme_io_md": false, 00:09:55.160 "write_zeroes": true, 00:09:55.160 "zcopy": true, 00:09:55.160 "get_zone_info": false, 00:09:55.160 "zone_management": false, 00:09:55.160 "zone_append": false, 00:09:55.160 "compare": false, 00:09:55.160 "compare_and_write": false, 00:09:55.160 "abort": true, 00:09:55.160 "seek_hole": false, 00:09:55.160 "seek_data": false, 00:09:55.160 "copy": true, 00:09:55.160 "nvme_iov_md": false 00:09:55.160 }, 00:09:55.160 "memory_domains": [ 00:09:55.160 { 00:09:55.160 "dma_device_id": "system", 00:09:55.160 "dma_device_type": 1 00:09:55.160 }, 00:09:55.160 { 
00:09:55.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.160 "dma_device_type": 2 00:09:55.160 } 00:09:55.160 ], 00:09:55.160 "driver_specific": {} 00:09:55.160 } 00:09:55.160 ] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:55.160 13:28:34 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:55.160 13:28:34 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:55.160 Running I/O for 5 seconds... 00:09:56.095 13:28:35 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2062599 00:09:56.095 13:28:35 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2062599' 00:09:56.095 Process is existed as continue on error is set. 
Pid: 2062599 00:09:56.095 13:28:35 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.095 13:28:35 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:56.095 13:28:35 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.095 13:28:35 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:56.353 Timeout while waiting for response: 00:09:56.353 00:09:56.353 00:10:00.544 00:10:00.544 Latency(us) 00:10:00.544 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:00.544 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:00.544 EE_Dev_1 : 0.92 37511.26 146.53 5.43 0.00 422.97 130.89 701.66 00:10:00.544 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:00.544 Dev_2 : 5.00 81006.87 316.43 0.00 0.00 193.97 66.34 22795.13 00:10:00.544 =================================================================================================================== 00:10:00.544 Total : 118518.14 462.96 5.43 0.00 211.98 66.34 22795.13 00:10:01.112 13:28:40 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2062599 00:10:01.112 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2062599 ']' 00:10:01.112 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2062599 00:10:01.112 13:28:40 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:01.112 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:01.112 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2062599 00:10:01.372 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:01.372 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:01.372 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2062599' 00:10:01.372 killing process with pid 2062599 00:10:01.372 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2062599 00:10:01.372 Received shutdown signal, test time was about 5.000000 seconds 00:10:01.372 00:10:01.372 Latency(us) 00:10:01.372 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.372 =================================================================================================================== 00:10:01.372 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:01.372 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2062599 00:10:01.631 13:28:40 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2063584 00:10:01.631 13:28:40 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2063584' 00:10:01.631 Process error testing pid: 2063584 00:10:01.631 13:28:40 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:01.631 13:28:40 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2063584 00:10:01.631 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2063584 ']' 00:10:01.631 13:28:40 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:01.631 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:01.631 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:01.631 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:01.631 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:01.631 13:28:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:01.631 [2024-07-15 13:28:40.889200] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:01.631 [2024-07-15 13:28:40.889276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2063584 ] 00:10:01.631 [2024-07-15 13:28:41.009771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:01.889 [2024-07-15 13:28:41.106603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:02.456 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.456 Dev_1 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.456 13:28:41 
blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.456 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.456 [ 00:10:02.456 { 00:10:02.456 "name": "Dev_1", 00:10:02.456 "aliases": [ 00:10:02.456 "5d0df43d-08fd-4c46-accf-bfd7e2e3a273" 00:10:02.456 ], 00:10:02.456 "product_name": "Malloc disk", 00:10:02.456 "block_size": 512, 00:10:02.456 "num_blocks": 262144, 00:10:02.456 "uuid": "5d0df43d-08fd-4c46-accf-bfd7e2e3a273", 00:10:02.456 "assigned_rate_limits": { 00:10:02.456 "rw_ios_per_sec": 0, 00:10:02.456 "rw_mbytes_per_sec": 0, 00:10:02.456 "r_mbytes_per_sec": 0, 00:10:02.456 "w_mbytes_per_sec": 0 00:10:02.456 }, 00:10:02.456 "claimed": false, 00:10:02.456 "zoned": false, 00:10:02.456 "supported_io_types": { 00:10:02.456 "read": true, 00:10:02.456 
"write": true, 00:10:02.456 "unmap": true, 00:10:02.456 "flush": true, 00:10:02.456 "reset": true, 00:10:02.456 "nvme_admin": false, 00:10:02.456 "nvme_io": false, 00:10:02.456 "nvme_io_md": false, 00:10:02.456 "write_zeroes": true, 00:10:02.456 "zcopy": true, 00:10:02.456 "get_zone_info": false, 00:10:02.456 "zone_management": false, 00:10:02.456 "zone_append": false, 00:10:02.456 "compare": false, 00:10:02.456 "compare_and_write": false, 00:10:02.714 "abort": true, 00:10:02.714 "seek_hole": false, 00:10:02.714 "seek_data": false, 00:10:02.714 "copy": true, 00:10:02.714 "nvme_iov_md": false 00:10:02.714 }, 00:10:02.714 "memory_domains": [ 00:10:02.714 { 00:10:02.714 "dma_device_id": "system", 00:10:02.714 "dma_device_type": 1 00:10:02.714 }, 00:10:02.714 { 00:10:02.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.714 "dma_device_type": 2 00:10:02.714 } 00:10:02.714 ], 00:10:02.714 "driver_specific": {} 00:10:02.714 } 00:10:02.714 ] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.714 true 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.714 Dev_2 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.714 [ 00:10:02.714 { 00:10:02.714 "name": "Dev_2", 00:10:02.714 "aliases": [ 00:10:02.714 "65a08e09-35fc-49b2-8bfa-474ec8c5152e" 00:10:02.714 ], 00:10:02.714 "product_name": "Malloc disk", 00:10:02.714 "block_size": 512, 00:10:02.714 "num_blocks": 262144, 00:10:02.714 "uuid": "65a08e09-35fc-49b2-8bfa-474ec8c5152e", 00:10:02.714 "assigned_rate_limits": { 00:10:02.714 "rw_ios_per_sec": 0, 00:10:02.714 "rw_mbytes_per_sec": 0, 00:10:02.714 "r_mbytes_per_sec": 0, 00:10:02.714 "w_mbytes_per_sec": 0 00:10:02.714 }, 00:10:02.714 "claimed": false, 00:10:02.714 "zoned": false, 00:10:02.714 "supported_io_types": { 
00:10:02.714 "read": true, 00:10:02.714 "write": true, 00:10:02.714 "unmap": true, 00:10:02.714 "flush": true, 00:10:02.714 "reset": true, 00:10:02.714 "nvme_admin": false, 00:10:02.714 "nvme_io": false, 00:10:02.714 "nvme_io_md": false, 00:10:02.714 "write_zeroes": true, 00:10:02.714 "zcopy": true, 00:10:02.714 "get_zone_info": false, 00:10:02.714 "zone_management": false, 00:10:02.714 "zone_append": false, 00:10:02.714 "compare": false, 00:10:02.714 "compare_and_write": false, 00:10:02.714 "abort": true, 00:10:02.714 "seek_hole": false, 00:10:02.714 "seek_data": false, 00:10:02.714 "copy": true, 00:10:02.714 "nvme_iov_md": false 00:10:02.714 }, 00:10:02.714 "memory_domains": [ 00:10:02.714 { 00:10:02.714 "dma_device_id": "system", 00:10:02.714 "dma_device_type": 1 00:10:02.714 }, 00:10:02.714 { 00:10:02.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.714 "dma_device_type": 2 00:10:02.714 } 00:10:02.714 ], 00:10:02.714 "driver_specific": {} 00:10:02.714 } 00:10:02.714 ] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2063584 00:10:02.714 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:02.714 13:28:41 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 
perform_tests 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2063584 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:02.715 13:28:41 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2063584 00:10:02.715 Running I/O for 5 seconds... 00:10:02.715 task offset: 32704 on job bdev=EE_Dev_1 fails 00:10:02.715 00:10:02.715 Latency(us) 00:10:02.715 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:02.715 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:02.715 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:02.715 EE_Dev_1 : 0.00 29490.62 115.20 6702.41 0.00 364.93 134.46 651.80 00:10:02.715 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:02.715 Dev_2 : 0.00 18202.50 71.10 0.00 0.00 647.43 126.44 1203.87 00:10:02.715 =================================================================================================================== 00:10:02.715 Total : 47693.12 186.30 6702.41 0.00 518.15 126.44 1203.87 00:10:02.715 [2024-07-15 13:28:42.083461] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:02.715 request: 00:10:02.715 { 00:10:02.715 "method": "perform_tests", 00:10:02.715 "req_id": 1 00:10:02.715 } 00:10:02.715 Got JSON-RPC error response 00:10:02.715 response: 00:10:02.715 { 00:10:02.715 "code": -32603, 00:10:02.715 "message": "bdevperf failed with error Operation not permitted" 00:10:02.715 } 00:10:02.972 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:02.973 00:10:02.973 real 0m9.068s 00:10:02.973 user 0m9.479s 00:10:02.973 sys 0m0.866s 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:02.973 13:28:42 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:02.973 ************************************ 00:10:02.973 END TEST bdev_error 00:10:02.973 ************************************ 00:10:03.230 13:28:42 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:03.230 13:28:42 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:03.230 13:28:42 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:03.230 13:28:42 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:03.230 13:28:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:03.230 ************************************ 00:10:03.230 START TEST bdev_stat 00:10:03.230 ************************************ 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2063845 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2063845' 00:10:03.230 Process Bdev IO statistics 
testing pid: 2063845 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2063845 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2063845 ']' 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:03.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:03.230 13:28:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:03.231 [2024-07-15 13:28:42.527208] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:10:03.231 [2024-07-15 13:28:42.527276] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2063845 ] 00:10:03.487 [2024-07-15 13:28:42.656871] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:03.487 [2024-07-15 13:28:42.763876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:03.487 [2024-07-15 13:28:42.763882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.103 Malloc_STAT 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.103 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:04.103 [ 00:10:04.103 { 00:10:04.103 "name": "Malloc_STAT", 00:10:04.103 "aliases": [ 00:10:04.103 "dd5c0579-b103-416d-ab4d-37da45437bca" 00:10:04.103 ], 00:10:04.103 "product_name": "Malloc disk", 00:10:04.103 "block_size": 512, 00:10:04.103 "num_blocks": 262144, 00:10:04.103 "uuid": "dd5c0579-b103-416d-ab4d-37da45437bca", 00:10:04.103 "assigned_rate_limits": { 00:10:04.103 "rw_ios_per_sec": 0, 00:10:04.103 "rw_mbytes_per_sec": 0, 00:10:04.103 "r_mbytes_per_sec": 0, 00:10:04.103 "w_mbytes_per_sec": 0 00:10:04.103 }, 00:10:04.103 "claimed": false, 00:10:04.103 "zoned": false, 00:10:04.103 "supported_io_types": { 00:10:04.103 "read": true, 00:10:04.103 "write": true, 00:10:04.103 "unmap": true, 00:10:04.103 "flush": true, 00:10:04.103 "reset": true, 00:10:04.103 "nvme_admin": false, 00:10:04.103 "nvme_io": false, 00:10:04.103 "nvme_io_md": false, 00:10:04.103 "write_zeroes": true, 00:10:04.103 "zcopy": true, 00:10:04.103 "get_zone_info": false, 00:10:04.103 "zone_management": false, 00:10:04.103 "zone_append": false, 00:10:04.103 "compare": false, 00:10:04.103 "compare_and_write": false, 00:10:04.103 "abort": true, 00:10:04.103 "seek_hole": false, 00:10:04.103 "seek_data": false, 00:10:04.103 "copy": true, 00:10:04.103 "nvme_iov_md": false 00:10:04.103 }, 00:10:04.103 "memory_domains": [ 00:10:04.103 { 00:10:04.103 "dma_device_id": "system", 
00:10:04.103 "dma_device_type": 1 00:10:04.103 }, 00:10:04.103 { 00:10:04.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.103 "dma_device_type": 2 00:10:04.104 } 00:10:04.104 ], 00:10:04.104 "driver_specific": {} 00:10:04.104 } 00:10:04.104 ] 00:10:04.104 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.104 13:28:43 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:04.104 13:28:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:04.104 13:28:43 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:04.361 Running I/O for 10 seconds... 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:06.267 
13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:06.267 "tick_rate": 2300000000, 00:10:06.267 "ticks": 5359615371475390, 00:10:06.267 "bdevs": [ 00:10:06.267 { 00:10:06.267 "name": "Malloc_STAT", 00:10:06.267 "bytes_read": 769700352, 00:10:06.267 "num_read_ops": 187908, 00:10:06.267 "bytes_written": 0, 00:10:06.267 "num_write_ops": 0, 00:10:06.267 "bytes_unmapped": 0, 00:10:06.267 "num_unmap_ops": 0, 00:10:06.267 "bytes_copied": 0, 00:10:06.267 "num_copy_ops": 0, 00:10:06.267 "read_latency_ticks": 2228470487558, 00:10:06.267 "max_read_latency_ticks": 14312044, 00:10:06.267 "min_read_latency_ticks": 270656, 00:10:06.267 "write_latency_ticks": 0, 00:10:06.267 "max_write_latency_ticks": 0, 00:10:06.267 "min_write_latency_ticks": 0, 00:10:06.267 "unmap_latency_ticks": 0, 00:10:06.267 "max_unmap_latency_ticks": 0, 00:10:06.267 "min_unmap_latency_ticks": 0, 00:10:06.267 "copy_latency_ticks": 0, 00:10:06.267 "max_copy_latency_ticks": 0, 00:10:06.267 "min_copy_latency_ticks": 0, 00:10:06.267 "io_error": {} 00:10:06.267 } 00:10:06.267 ] 00:10:06.267 }' 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=187908 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.267 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:06.267 "tick_rate": 2300000000, 00:10:06.267 "ticks": 5359615539955174, 
00:10:06.267 "name": "Malloc_STAT", 00:10:06.267 "channels": [ 00:10:06.267 { 00:10:06.267 "thread_id": 2, 00:10:06.267 "bytes_read": 394264576, 00:10:06.267 "num_read_ops": 96256, 00:10:06.267 "bytes_written": 0, 00:10:06.267 "num_write_ops": 0, 00:10:06.267 "bytes_unmapped": 0, 00:10:06.267 "num_unmap_ops": 0, 00:10:06.267 "bytes_copied": 0, 00:10:06.267 "num_copy_ops": 0, 00:10:06.267 "read_latency_ticks": 1156348424784, 00:10:06.267 "max_read_latency_ticks": 12703506, 00:10:06.267 "min_read_latency_ticks": 7669124, 00:10:06.267 "write_latency_ticks": 0, 00:10:06.267 "max_write_latency_ticks": 0, 00:10:06.267 "min_write_latency_ticks": 0, 00:10:06.267 "unmap_latency_ticks": 0, 00:10:06.267 "max_unmap_latency_ticks": 0, 00:10:06.267 "min_unmap_latency_ticks": 0, 00:10:06.267 "copy_latency_ticks": 0, 00:10:06.267 "max_copy_latency_ticks": 0, 00:10:06.267 "min_copy_latency_ticks": 0 00:10:06.267 }, 00:10:06.267 { 00:10:06.267 "thread_id": 3, 00:10:06.267 "bytes_read": 404750336, 00:10:06.267 "num_read_ops": 98816, 00:10:06.267 "bytes_written": 0, 00:10:06.267 "num_write_ops": 0, 00:10:06.267 "bytes_unmapped": 0, 00:10:06.267 "num_unmap_ops": 0, 00:10:06.267 "bytes_copied": 0, 00:10:06.267 "num_copy_ops": 0, 00:10:06.267 "read_latency_ticks": 1157136373254, 00:10:06.267 "max_read_latency_ticks": 14312044, 00:10:06.267 "min_read_latency_ticks": 7686664, 00:10:06.267 "write_latency_ticks": 0, 00:10:06.267 "max_write_latency_ticks": 0, 00:10:06.267 "min_write_latency_ticks": 0, 00:10:06.267 "unmap_latency_ticks": 0, 00:10:06.267 "max_unmap_latency_ticks": 0, 00:10:06.267 "min_unmap_latency_ticks": 0, 00:10:06.267 "copy_latency_ticks": 0, 00:10:06.267 "max_copy_latency_ticks": 0, 00:10:06.267 "min_copy_latency_ticks": 0 00:10:06.267 } 00:10:06.267 ] 00:10:06.267 }' 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=96256 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=96256 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=98816 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=195072 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.268 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:06.526 "tick_rate": 2300000000, 00:10:06.526 "ticks": 5359615804764588, 00:10:06.526 "bdevs": [ 00:10:06.526 { 00:10:06.526 "name": "Malloc_STAT", 00:10:06.526 "bytes_read": 846246400, 00:10:06.526 "num_read_ops": 206596, 00:10:06.526 "bytes_written": 0, 00:10:06.526 "num_write_ops": 0, 00:10:06.526 "bytes_unmapped": 0, 00:10:06.526 "num_unmap_ops": 0, 00:10:06.526 "bytes_copied": 0, 00:10:06.526 "num_copy_ops": 0, 00:10:06.526 "read_latency_ticks": 2450230995388, 00:10:06.526 "max_read_latency_ticks": 14312044, 00:10:06.526 "min_read_latency_ticks": 270656, 00:10:06.526 "write_latency_ticks": 0, 00:10:06.526 "max_write_latency_ticks": 0, 00:10:06.526 "min_write_latency_ticks": 0, 00:10:06.526 "unmap_latency_ticks": 0, 00:10:06.526 "max_unmap_latency_ticks": 0, 00:10:06.526 "min_unmap_latency_ticks": 0, 00:10:06.526 "copy_latency_ticks": 0, 00:10:06.526 "max_copy_latency_ticks": 0, 00:10:06.526 "min_copy_latency_ticks": 0, 00:10:06.526 "io_error": {} 00:10:06.526 } 00:10:06.526 ] 00:10:06.526 }' 00:10:06.526 
13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=206596 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195072 -lt 187908 ']' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195072 -gt 206596 ']' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:06.526 00:10:06.526 Latency(us) 00:10:06.526 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:06.526 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:06.526 Malloc_STAT : 2.16 48912.01 191.06 0.00 0.00 5221.10 1852.10 5527.82 00:10:06.526 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:06.526 Malloc_STAT : 2.16 50214.96 196.15 0.00 0.00 5085.83 1745.25 6240.17 00:10:06.526 =================================================================================================================== 00:10:06.526 Total : 99126.97 387.21 0.00 0.00 5152.58 1745.25 6240.17 00:10:06.526 0 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2063845 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2063845 ']' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2063845 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2063845 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2063845' 00:10:06.526 killing process with pid 2063845 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2063845 00:10:06.526 Received shutdown signal, test time was about 2.238963 seconds 00:10:06.526 00:10:06.526 Latency(us) 00:10:06.526 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:06.526 =================================================================================================================== 00:10:06.526 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:06.526 13:28:45 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2063845 00:10:06.786 13:28:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:06.786 00:10:06.786 real 0m3.582s 00:10:06.786 user 0m7.137s 00:10:06.786 sys 0m0.442s 00:10:06.786 13:28:46 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.786 13:28:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:06.786 ************************************ 00:10:06.786 END TEST bdev_stat 00:10:06.786 ************************************ 00:10:06.786 13:28:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:06.786 13:28:46 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:06.786 00:10:06.786 real 1m56.617s 00:10:06.786 user 7m10.778s 00:10:06.786 sys 0m23.275s 00:10:06.786 13:28:46 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.786 13:28:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:06.786 ************************************ 00:10:06.786 END TEST blockdev_general 00:10:06.786 ************************************ 00:10:06.786 13:28:46 -- common/autotest_common.sh@1142 -- # return 0 00:10:06.786 13:28:46 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:06.786 13:28:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:06.786 13:28:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:06.786 13:28:46 -- common/autotest_common.sh@10 -- # set +x 00:10:06.786 ************************************ 00:10:06.786 START TEST bdev_raid 00:10:06.786 ************************************ 00:10:06.786 13:28:46 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:07.045 * Looking for test storage... 
00:10:07.045 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:07.045 13:28:46 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:07.045 13:28:46 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:07.046 13:28:46 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:07.046 13:28:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:07.046 13:28:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:07.046 13:28:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:07.046 ************************************ 00:10:07.046 START TEST raid_function_test_raid0 00:10:07.046 ************************************ 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:07.046 13:28:46 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2064441 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2064441' 00:10:07.046 Process raid pid: 2064441 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2064441 /var/tmp/spdk-raid.sock 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2064441 ']' 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:07.046 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:07.046 13:28:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:07.046 [2024-07-15 13:28:46.414604] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:10:07.046 [2024-07-15 13:28:46.414678] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:07.305 [2024-07-15 13:28:46.543053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.305 [2024-07-15 13:28:46.649401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.305 [2024-07-15 13:28:46.709759] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:07.305 [2024-07-15 13:28:46.709793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:08.279 [2024-07-15 13:28:47.597862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:08.279 [2024-07-15 13:28:47.599361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:08.279 [2024-07-15 13:28:47.599420] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1abd0 00:10:08.279 [2024-07-15 13:28:47.599431] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:08.279 [2024-07-15 13:28:47.599616] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1ab10 00:10:08.279 [2024-07-15 13:28:47.599736] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1abd0 00:10:08.279 [2024-07-15 13:28:47.599749] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xf1abd0 00:10:08.279 [2024-07-15 13:28:47.599850] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:08.279 Base_1 00:10:08.279 Base_2 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:08.279 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:08.539 13:28:47 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:08.798 [2024-07-15 13:28:48.099215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10ce8e0 00:10:08.798 /dev/nbd0 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:08.798 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:08.798 1+0 records in 00:10:08.798 1+0 records out 
00:10:08.798 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239733 s, 17.1 MB/s 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:09.057 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:09.316 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:09.316 { 00:10:09.317 "nbd_device": "/dev/nbd0", 00:10:09.317 "bdev_name": "raid" 00:10:09.317 } 00:10:09.317 ]' 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:09.317 { 00:10:09.317 "nbd_device": "/dev/nbd0", 00:10:09.317 "bdev_name": "raid" 00:10:09.317 } 00:10:09.317 ]' 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:09.317 13:28:48 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:09.317 4096+0 records in 00:10:09.317 4096+0 records out 00:10:09.317 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0297852 s, 70.4 MB/s 00:10:09.317 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:09.576 4096+0 records in 00:10:09.576 4096+0 records out 00:10:09.576 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.204936 s, 10.2 MB/s 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:09.576 128+0 records in 00:10:09.576 128+0 records out 00:10:09.576 65536 
bytes (66 kB, 64 KiB) copied, 0.000827949 s, 79.2 MB/s 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:09.576 2035+0 records in 00:10:09.576 2035+0 records out 00:10:09.576 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.010763 s, 96.8 MB/s 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:09.576 13:28:48 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:09.576 456+0 records in 00:10:09.577 456+0 records out 00:10:09.577 233472 bytes (233 kB, 228 KiB) copied, 0.0027513 s, 84.9 MB/s 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:09.577 13:28:48 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:09.836 [2024-07-15 13:28:49.166199] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:09.836 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2064441 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2064441 ']' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2064441 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:10.095 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2064441 00:10:10.354 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:10.354 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:10.354 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2064441' 00:10:10.354 killing process with pid 2064441 00:10:10.354 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2064441 00:10:10.354 [2024-07-15 13:28:49.527177] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:10.354 [2024-07-15 13:28:49.527243] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:10.354 [2024-07-15 13:28:49.527286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:10.355 [2024-07-15 13:28:49.527303] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1abd0 name 
raid, state offline 00:10:10.355 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2064441 00:10:10.355 [2024-07-15 13:28:49.544329] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:10.355 13:28:49 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:10.355 00:10:10.355 real 0m3.415s 00:10:10.355 user 0m4.539s 00:10:10.355 sys 0m1.237s 00:10:10.355 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:10.355 13:28:49 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:10.355 ************************************ 00:10:10.355 END TEST raid_function_test_raid0 00:10:10.355 ************************************ 00:10:10.614 13:28:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:10.614 13:28:49 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:10.614 13:28:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:10.614 13:28:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:10.614 13:28:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:10.614 ************************************ 00:10:10.614 START TEST raid_function_test_concat 00:10:10.614 ************************************ 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2064916 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2064916' 00:10:10.614 Process raid pid: 2064916 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2064916 /var/tmp/spdk-raid.sock 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2064916 ']' 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:10.614 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:10.615 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:10.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:10.615 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:10.615 13:28:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:10.615 [2024-07-15 13:28:49.916279] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:10:10.615 [2024-07-15 13:28:49.916350] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:10.880 [2024-07-15 13:28:50.049718] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:10.880 [2024-07-15 13:28:50.155848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:10.880 [2024-07-15 13:28:50.210981] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:10.880 [2024-07-15 13:28:50.211007] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:11.449 13:28:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:11.449 13:28:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:11.449 13:28:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:11.449 13:28:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:11.450 13:28:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:11.450 13:28:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:11.450 13:28:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:11.709 [2024-07-15 13:28:51.053781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:11.709 [2024-07-15 13:28:51.055242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:11.709 [2024-07-15 13:28:51.055301] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1398bd0 00:10:11.709 [2024-07-15 13:28:51.055312] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:11.709 [2024-07-15 13:28:51.055499] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1398b10 00:10:11.709 [2024-07-15 13:28:51.055617] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1398bd0 00:10:11.709 [2024-07-15 13:28:51.055627] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1398bd0 00:10:11.709 [2024-07-15 13:28:51.055725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:11.709 Base_1 00:10:11.709 Base_2 00:10:11.709 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:11.709 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:11.709 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:11.968 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:12.537 [2024-07-15 13:28:51.811796] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154c8e0 00:10:12.538 /dev/nbd0 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.538 1+0 records in 
00:10:12.538 1+0 records out 00:10:12.538 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002704 s, 15.1 MB/s 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:12.538 13:28:51 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:12.797 { 00:10:12.797 "nbd_device": "/dev/nbd0", 00:10:12.797 "bdev_name": "raid" 00:10:12.797 } 00:10:12.797 ]' 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:12.797 { 00:10:12.797 "nbd_device": "/dev/nbd0", 00:10:12.797 "bdev_name": "raid" 00:10:12.797 } 00:10:12.797 ]' 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:12.797 13:28:52 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:12.797 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:13.056 4096+0 records in 00:10:13.056 4096+0 records out 00:10:13.056 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0300979 s, 69.7 MB/s 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:13.056 4096+0 records in 00:10:13.056 4096+0 records out 00:10:13.056 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.208147 s, 10.1 MB/s 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 
00:10:13.056 128+0 records in 00:10:13.056 128+0 records out 00:10:13.056 65536 bytes (66 kB, 64 KiB) copied, 0.000841332 s, 77.9 MB/s 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:13.056 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:13.315 2035+0 records in 00:10:13.315 2035+0 records out 00:10:13.315 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0108076 s, 96.4 MB/s 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:13.315 456+0 records in 00:10:13.315 456+0 records out 00:10:13.315 233472 bytes (233 kB, 228 KiB) copied, 0.00275714 s, 84.7 MB/s 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:13.315 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:13.573 [2024-07-15 13:28:52.808274] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:13.573 13:28:52 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # true 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2064916 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2064916 ']' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2064916 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2064916 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2064916' 00:10:13.832 killing process with pid 2064916 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2064916 00:10:13.832 [2024-07-15 13:28:53.165156] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:13.832 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2064916 00:10:13.832 [2024-07-15 13:28:53.165229] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:13.832 [2024-07-15 
13:28:53.165272] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:13.832 [2024-07-15 13:28:53.165285] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1398bd0 name raid, state offline 00:10:13.832 [2024-07-15 13:28:53.181427] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:14.090 13:28:53 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:14.090 00:10:14.090 real 0m3.543s 00:10:14.090 user 0m4.806s 00:10:14.090 sys 0m1.296s 00:10:14.090 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:14.090 13:28:53 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:14.090 ************************************ 00:10:14.090 END TEST raid_function_test_concat 00:10:14.090 ************************************ 00:10:14.090 13:28:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:14.090 13:28:53 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:14.090 13:28:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:14.090 13:28:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:14.090 13:28:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:14.090 ************************************ 00:10:14.090 START TEST raid0_resize_test 00:10:14.090 ************************************ 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # 
local blkcnt 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2065520 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2065520' 00:10:14.090 Process raid pid: 2065520 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2065520 /var/tmp/spdk-raid.sock 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2065520 ']' 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:14.090 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:14.090 13:28:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:14.347 [2024-07-15 13:28:53.530465] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:10:14.347 [2024-07-15 13:28:53.530535] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:14.347 [2024-07-15 13:28:53.661489] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.347 [2024-07-15 13:28:53.768873] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.605 [2024-07-15 13:28:53.836346] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:14.605 [2024-07-15 13:28:53.836383] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:15.171 13:28:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:15.171 13:28:54 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:15.171 13:28:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:15.429 Base_1 00:10:15.429 13:28:54 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:15.996 Base_2 00:10:15.996 13:28:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:16.254 [2024-07-15 13:28:55.429597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:16.254 [2024-07-15 13:28:55.430984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:16.254 [2024-07-15 13:28:55.431033] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c0780 00:10:16.254 [2024-07-15 13:28:55.431043] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:16.254 [2024-07-15 13:28:55.431250] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x120c020 00:10:16.254 [2024-07-15 13:28:55.431343] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c0780 00:10:16.254 [2024-07-15 13:28:55.431353] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x16c0780 00:10:16.254 [2024-07-15 13:28:55.431462] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:16.254 13:28:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:16.511 [2024-07-15 13:28:55.930918] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:16.511 [2024-07-15 13:28:55.930951] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:16.511 true 00:10:16.770 13:28:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:16.770 13:28:55 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:16.770 [2024-07-15 13:28:56.187724] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:17.028 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:17.028 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:17.028 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:17.028 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:17.028 
[2024-07-15 13:28:56.432186] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:17.028 [2024-07-15 13:28:56.432206] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:17.028 [2024-07-15 13:28:56.432233] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:17.028 true 00:10:17.028 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:17.286 [2024-07-15 13:28:56.677004] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2065520 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2065520 ']' 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2065520 00:10:17.286 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2065520 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2065520' 00:10:17.544 killing process with pid 2065520 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2065520 00:10:17.544 [2024-07-15 13:28:56.754228] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2065520 00:10:17.544 [2024-07-15 13:28:56.754281] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:17.544 [2024-07-15 13:28:56.754322] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:17.544 [2024-07-15 13:28:56.754333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c0780 name Raid, state offline 00:10:17.544 [2024-07-15 13:28:56.755610] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:17.544 00:10:17.544 real 0m3.490s 00:10:17.544 user 0m5.533s 00:10:17.544 sys 0m0.728s 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:17.544 13:28:56 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:17.544 ************************************ 00:10:17.544 END TEST raid0_resize_test 00:10:17.544 ************************************ 00:10:17.803 13:28:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:17.803 13:28:57 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:17.803 13:28:57 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:17.803 13:28:57 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:17.803 13:28:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:10:17.803 13:28:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:17.803 13:28:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:17.803 ************************************ 00:10:17.803 START TEST raid_state_function_test 00:10:17.803 ************************************ 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:17.803 13:28:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2066020 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2066020' 00:10:17.803 Process raid pid: 2066020 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2066020 /var/tmp/spdk-raid.sock 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2066020 ']' 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:17.803 13:28:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:17.803 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:17.803 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:17.803 [2024-07-15 13:28:57.117598] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:17.803 [2024-07-15 13:28:57.117676] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:18.061 [2024-07-15 13:28:57.247260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.061 [2024-07-15 13:28:57.347210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.061 [2024-07-15 13:28:57.411531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.061 [2024-07-15 13:28:57.411562] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.626 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:18.626 13:28:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:18.626 13:28:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:18.884 [2024-07-15 13:28:58.202712] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:18.884 [2024-07-15 13:28:58.202755] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:18.884 [2024-07-15 13:28:58.202766] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:18.884 [2024-07-15 13:28:58.202778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:18.884 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:19.142 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:19.142 "name": "Existed_Raid", 00:10:19.142 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:19.142 "strip_size_kb": 64, 00:10:19.142 "state": "configuring", 00:10:19.142 "raid_level": "raid0", 00:10:19.142 "superblock": false, 00:10:19.142 "num_base_bdevs": 2, 00:10:19.142 "num_base_bdevs_discovered": 0, 00:10:19.142 "num_base_bdevs_operational": 2, 00:10:19.142 "base_bdevs_list": [ 00:10:19.142 { 00:10:19.142 "name": "BaseBdev1", 00:10:19.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:19.142 "is_configured": false, 00:10:19.142 "data_offset": 0, 00:10:19.142 "data_size": 0 00:10:19.142 }, 00:10:19.142 { 00:10:19.142 "name": "BaseBdev2", 00:10:19.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:19.142 "is_configured": false, 00:10:19.142 "data_offset": 0, 00:10:19.142 "data_size": 0 00:10:19.142 } 00:10:19.142 ] 00:10:19.142 }' 00:10:19.142 13:28:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:19.142 13:28:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:19.707 13:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:19.964 [2024-07-15 13:28:59.261406] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:19.964 [2024-07-15 13:28:59.261440] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2168a80 name Existed_Raid, state configuring 00:10:19.964 13:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:20.222 [2024-07-15 13:28:59.437884] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:20.222 [2024-07-15 13:28:59.437913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:10:20.222 [2024-07-15 13:28:59.437923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:20.222 [2024-07-15 13:28:59.437942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:20.222 13:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:20.222 [2024-07-15 13:28:59.632253] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:20.222 BaseBdev1 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:20.480 13:28:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:20.739 [ 00:10:20.739 { 00:10:20.739 "name": "BaseBdev1", 00:10:20.739 "aliases": [ 00:10:20.739 "a6ad6a8d-88c7-4549-a8fd-d494de1966ef" 00:10:20.739 ], 00:10:20.739 "product_name": "Malloc disk", 00:10:20.739 "block_size": 512, 00:10:20.739 "num_blocks": 65536, 
00:10:20.739 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:20.739 "assigned_rate_limits": { 00:10:20.739 "rw_ios_per_sec": 0, 00:10:20.739 "rw_mbytes_per_sec": 0, 00:10:20.739 "r_mbytes_per_sec": 0, 00:10:20.739 "w_mbytes_per_sec": 0 00:10:20.739 }, 00:10:20.739 "claimed": true, 00:10:20.739 "claim_type": "exclusive_write", 00:10:20.739 "zoned": false, 00:10:20.739 "supported_io_types": { 00:10:20.739 "read": true, 00:10:20.739 "write": true, 00:10:20.739 "unmap": true, 00:10:20.739 "flush": true, 00:10:20.739 "reset": true, 00:10:20.739 "nvme_admin": false, 00:10:20.739 "nvme_io": false, 00:10:20.739 "nvme_io_md": false, 00:10:20.739 "write_zeroes": true, 00:10:20.739 "zcopy": true, 00:10:20.739 "get_zone_info": false, 00:10:20.739 "zone_management": false, 00:10:20.739 "zone_append": false, 00:10:20.739 "compare": false, 00:10:20.739 "compare_and_write": false, 00:10:20.739 "abort": true, 00:10:20.739 "seek_hole": false, 00:10:20.739 "seek_data": false, 00:10:20.739 "copy": true, 00:10:20.739 "nvme_iov_md": false 00:10:20.739 }, 00:10:20.739 "memory_domains": [ 00:10:20.739 { 00:10:20.739 "dma_device_id": "system", 00:10:20.739 "dma_device_type": 1 00:10:20.739 }, 00:10:20.739 { 00:10:20.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:20.739 "dma_device_type": 2 00:10:20.739 } 00:10:20.739 ], 00:10:20.739 "driver_specific": {} 00:10:20.739 } 00:10:20.739 ] 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.739 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:20.997 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.997 "name": "Existed_Raid", 00:10:20.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.997 "strip_size_kb": 64, 00:10:20.997 "state": "configuring", 00:10:20.997 "raid_level": "raid0", 00:10:20.997 "superblock": false, 00:10:20.997 "num_base_bdevs": 2, 00:10:20.997 "num_base_bdevs_discovered": 1, 00:10:20.997 "num_base_bdevs_operational": 2, 00:10:20.997 "base_bdevs_list": [ 00:10:20.997 { 00:10:20.997 "name": "BaseBdev1", 00:10:20.997 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:20.997 "is_configured": true, 00:10:20.997 "data_offset": 0, 00:10:20.997 "data_size": 65536 00:10:20.997 }, 00:10:20.997 { 00:10:20.997 "name": "BaseBdev2", 00:10:20.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.997 "is_configured": false, 00:10:20.997 "data_offset": 0, 00:10:20.997 "data_size": 0 00:10:20.997 } 00:10:20.997 ] 00:10:20.997 }' 00:10:20.997 13:29:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.997 13:29:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:21.564 13:29:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:21.823 [2024-07-15 13:29:01.148291] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:21.823 [2024-07-15 13:29:01.148333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2168350 name Existed_Raid, state configuring 00:10:21.823 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:22.082 [2024-07-15 13:29:01.324791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:22.082 [2024-07-15 13:29:01.326339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:22.082 [2024-07-15 13:29:01.326372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:22.082 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:22.341 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:22.341 "name": "Existed_Raid", 00:10:22.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.341 "strip_size_kb": 64, 00:10:22.341 "state": "configuring", 00:10:22.341 "raid_level": "raid0", 00:10:22.341 "superblock": false, 00:10:22.341 "num_base_bdevs": 2, 00:10:22.341 "num_base_bdevs_discovered": 1, 00:10:22.341 "num_base_bdevs_operational": 2, 00:10:22.341 "base_bdevs_list": [ 00:10:22.341 { 00:10:22.341 "name": "BaseBdev1", 00:10:22.341 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:22.341 "is_configured": true, 00:10:22.341 "data_offset": 0, 00:10:22.341 "data_size": 65536 00:10:22.341 }, 00:10:22.341 { 00:10:22.341 "name": "BaseBdev2", 00:10:22.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.341 "is_configured": false, 00:10:22.341 "data_offset": 0, 00:10:22.341 "data_size": 0 00:10:22.341 } 00:10:22.341 ] 00:10:22.341 }' 
00:10:22.341 13:29:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:22.341 13:29:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:22.909 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:23.168 [2024-07-15 13:29:02.355015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:23.168 [2024-07-15 13:29:02.355054] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2169000 00:10:23.168 [2024-07-15 13:29:02.355063] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:23.168 [2024-07-15 13:29:02.355253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20830c0 00:10:23.168 [2024-07-15 13:29:02.355374] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2169000 00:10:23.168 [2024-07-15 13:29:02.355384] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2169000 00:10:23.168 [2024-07-15 13:29:02.355550] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:23.168 BaseBdev2 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:23.168 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:23.428 [ 00:10:23.428 { 00:10:23.428 "name": "BaseBdev2", 00:10:23.428 "aliases": [ 00:10:23.428 "ae11ef79-97c9-4e0b-913d-6051b41c53a7" 00:10:23.428 ], 00:10:23.428 "product_name": "Malloc disk", 00:10:23.428 "block_size": 512, 00:10:23.428 "num_blocks": 65536, 00:10:23.428 "uuid": "ae11ef79-97c9-4e0b-913d-6051b41c53a7", 00:10:23.428 "assigned_rate_limits": { 00:10:23.428 "rw_ios_per_sec": 0, 00:10:23.428 "rw_mbytes_per_sec": 0, 00:10:23.428 "r_mbytes_per_sec": 0, 00:10:23.428 "w_mbytes_per_sec": 0 00:10:23.428 }, 00:10:23.428 "claimed": true, 00:10:23.428 "claim_type": "exclusive_write", 00:10:23.428 "zoned": false, 00:10:23.428 "supported_io_types": { 00:10:23.428 "read": true, 00:10:23.428 "write": true, 00:10:23.428 "unmap": true, 00:10:23.428 "flush": true, 00:10:23.428 "reset": true, 00:10:23.428 "nvme_admin": false, 00:10:23.428 "nvme_io": false, 00:10:23.428 "nvme_io_md": false, 00:10:23.428 "write_zeroes": true, 00:10:23.428 "zcopy": true, 00:10:23.428 "get_zone_info": false, 00:10:23.428 "zone_management": false, 00:10:23.428 "zone_append": false, 00:10:23.428 "compare": false, 00:10:23.428 "compare_and_write": false, 00:10:23.428 "abort": true, 00:10:23.428 "seek_hole": false, 00:10:23.428 "seek_data": false, 00:10:23.428 "copy": true, 00:10:23.428 "nvme_iov_md": false 00:10:23.428 }, 00:10:23.428 "memory_domains": [ 00:10:23.428 { 00:10:23.428 "dma_device_id": "system", 00:10:23.428 "dma_device_type": 1 00:10:23.428 }, 00:10:23.428 { 00:10:23.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:23.428 "dma_device_type": 2 
00:10:23.428 } 00:10:23.428 ], 00:10:23.428 "driver_specific": {} 00:10:23.428 } 00:10:23.428 ] 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.428 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:23.688 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:23.688 "name": "Existed_Raid", 00:10:23.688 "uuid": "1a0bdda4-3e0d-4a68-8b47-d6e91a222a44", 00:10:23.688 "strip_size_kb": 64, 00:10:23.688 "state": "online", 00:10:23.688 "raid_level": "raid0", 00:10:23.688 "superblock": false, 00:10:23.688 "num_base_bdevs": 2, 00:10:23.688 "num_base_bdevs_discovered": 2, 00:10:23.688 "num_base_bdevs_operational": 2, 00:10:23.688 "base_bdevs_list": [ 00:10:23.688 { 00:10:23.688 "name": "BaseBdev1", 00:10:23.688 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:23.688 "is_configured": true, 00:10:23.688 "data_offset": 0, 00:10:23.688 "data_size": 65536 00:10:23.688 }, 00:10:23.688 { 00:10:23.688 "name": "BaseBdev2", 00:10:23.688 "uuid": "ae11ef79-97c9-4e0b-913d-6051b41c53a7", 00:10:23.688 "is_configured": true, 00:10:23.688 "data_offset": 0, 00:10:23.688 "data_size": 65536 00:10:23.688 } 00:10:23.688 ] 00:10:23.688 }' 00:10:23.688 13:29:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:23.688 13:29:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:24.292 [2024-07-15 13:29:03.662775] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:24.292 "name": "Existed_Raid", 00:10:24.292 "aliases": [ 00:10:24.292 "1a0bdda4-3e0d-4a68-8b47-d6e91a222a44" 00:10:24.292 ], 00:10:24.292 "product_name": "Raid Volume", 00:10:24.292 "block_size": 512, 00:10:24.292 "num_blocks": 131072, 00:10:24.292 "uuid": "1a0bdda4-3e0d-4a68-8b47-d6e91a222a44", 00:10:24.292 "assigned_rate_limits": { 00:10:24.292 "rw_ios_per_sec": 0, 00:10:24.292 "rw_mbytes_per_sec": 0, 00:10:24.292 "r_mbytes_per_sec": 0, 00:10:24.292 "w_mbytes_per_sec": 0 00:10:24.292 }, 00:10:24.292 "claimed": false, 00:10:24.292 "zoned": false, 00:10:24.292 "supported_io_types": { 00:10:24.292 "read": true, 00:10:24.292 "write": true, 00:10:24.292 "unmap": true, 00:10:24.292 "flush": true, 00:10:24.292 "reset": true, 00:10:24.292 "nvme_admin": false, 00:10:24.292 "nvme_io": false, 00:10:24.292 "nvme_io_md": false, 00:10:24.292 "write_zeroes": true, 00:10:24.292 "zcopy": false, 00:10:24.292 "get_zone_info": false, 00:10:24.292 "zone_management": false, 00:10:24.292 "zone_append": false, 00:10:24.292 "compare": false, 00:10:24.292 "compare_and_write": false, 00:10:24.292 "abort": false, 00:10:24.292 "seek_hole": false, 00:10:24.292 "seek_data": false, 00:10:24.292 "copy": false, 00:10:24.292 "nvme_iov_md": false 00:10:24.292 }, 00:10:24.292 "memory_domains": [ 00:10:24.292 { 00:10:24.292 "dma_device_id": "system", 00:10:24.292 "dma_device_type": 1 00:10:24.292 }, 00:10:24.292 { 00:10:24.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:24.292 "dma_device_type": 2 00:10:24.292 }, 00:10:24.292 { 00:10:24.292 "dma_device_id": "system", 00:10:24.292 "dma_device_type": 1 00:10:24.292 }, 00:10:24.292 { 00:10:24.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:24.292 
"dma_device_type": 2 00:10:24.292 } 00:10:24.292 ], 00:10:24.292 "driver_specific": { 00:10:24.292 "raid": { 00:10:24.292 "uuid": "1a0bdda4-3e0d-4a68-8b47-d6e91a222a44", 00:10:24.292 "strip_size_kb": 64, 00:10:24.292 "state": "online", 00:10:24.292 "raid_level": "raid0", 00:10:24.292 "superblock": false, 00:10:24.292 "num_base_bdevs": 2, 00:10:24.292 "num_base_bdevs_discovered": 2, 00:10:24.292 "num_base_bdevs_operational": 2, 00:10:24.292 "base_bdevs_list": [ 00:10:24.292 { 00:10:24.292 "name": "BaseBdev1", 00:10:24.292 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:24.292 "is_configured": true, 00:10:24.292 "data_offset": 0, 00:10:24.292 "data_size": 65536 00:10:24.292 }, 00:10:24.292 { 00:10:24.292 "name": "BaseBdev2", 00:10:24.292 "uuid": "ae11ef79-97c9-4e0b-913d-6051b41c53a7", 00:10:24.292 "is_configured": true, 00:10:24.292 "data_offset": 0, 00:10:24.292 "data_size": 65536 00:10:24.292 } 00:10:24.292 ] 00:10:24.292 } 00:10:24.292 } 00:10:24.292 }' 00:10:24.292 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:24.551 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:24.551 BaseBdev2' 00:10:24.551 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:24.551 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:24.551 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:24.810 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:24.810 "name": "BaseBdev1", 00:10:24.810 "aliases": [ 00:10:24.810 "a6ad6a8d-88c7-4549-a8fd-d494de1966ef" 00:10:24.810 ], 00:10:24.810 "product_name": "Malloc disk", 00:10:24.810 
"block_size": 512, 00:10:24.810 "num_blocks": 65536, 00:10:24.810 "uuid": "a6ad6a8d-88c7-4549-a8fd-d494de1966ef", 00:10:24.810 "assigned_rate_limits": { 00:10:24.810 "rw_ios_per_sec": 0, 00:10:24.810 "rw_mbytes_per_sec": 0, 00:10:24.810 "r_mbytes_per_sec": 0, 00:10:24.810 "w_mbytes_per_sec": 0 00:10:24.810 }, 00:10:24.810 "claimed": true, 00:10:24.810 "claim_type": "exclusive_write", 00:10:24.810 "zoned": false, 00:10:24.810 "supported_io_types": { 00:10:24.810 "read": true, 00:10:24.810 "write": true, 00:10:24.810 "unmap": true, 00:10:24.810 "flush": true, 00:10:24.810 "reset": true, 00:10:24.810 "nvme_admin": false, 00:10:24.810 "nvme_io": false, 00:10:24.810 "nvme_io_md": false, 00:10:24.810 "write_zeroes": true, 00:10:24.810 "zcopy": true, 00:10:24.810 "get_zone_info": false, 00:10:24.810 "zone_management": false, 00:10:24.810 "zone_append": false, 00:10:24.810 "compare": false, 00:10:24.810 "compare_and_write": false, 00:10:24.810 "abort": true, 00:10:24.810 "seek_hole": false, 00:10:24.810 "seek_data": false, 00:10:24.810 "copy": true, 00:10:24.810 "nvme_iov_md": false 00:10:24.810 }, 00:10:24.810 "memory_domains": [ 00:10:24.810 { 00:10:24.810 "dma_device_id": "system", 00:10:24.810 "dma_device_type": 1 00:10:24.810 }, 00:10:24.810 { 00:10:24.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:24.810 "dma_device_type": 2 00:10:24.810 } 00:10:24.810 ], 00:10:24.810 "driver_specific": {} 00:10:24.810 }' 00:10:24.810 13:29:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:24.810 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:25.070 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:25.329 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:25.329 "name": "BaseBdev2", 00:10:25.329 "aliases": [ 00:10:25.329 "ae11ef79-97c9-4e0b-913d-6051b41c53a7" 00:10:25.330 ], 00:10:25.330 "product_name": "Malloc disk", 00:10:25.330 "block_size": 512, 00:10:25.330 "num_blocks": 65536, 00:10:25.330 "uuid": "ae11ef79-97c9-4e0b-913d-6051b41c53a7", 00:10:25.330 "assigned_rate_limits": { 00:10:25.330 "rw_ios_per_sec": 0, 00:10:25.330 "rw_mbytes_per_sec": 0, 00:10:25.330 "r_mbytes_per_sec": 0, 00:10:25.330 "w_mbytes_per_sec": 0 00:10:25.330 }, 00:10:25.330 "claimed": true, 00:10:25.330 "claim_type": "exclusive_write", 00:10:25.330 "zoned": false, 00:10:25.330 "supported_io_types": { 00:10:25.330 "read": true, 00:10:25.330 "write": true, 00:10:25.330 "unmap": true, 00:10:25.330 "flush": true, 00:10:25.330 "reset": true, 00:10:25.330 "nvme_admin": 
false, 00:10:25.330 "nvme_io": false, 00:10:25.330 "nvme_io_md": false, 00:10:25.330 "write_zeroes": true, 00:10:25.330 "zcopy": true, 00:10:25.330 "get_zone_info": false, 00:10:25.330 "zone_management": false, 00:10:25.330 "zone_append": false, 00:10:25.330 "compare": false, 00:10:25.330 "compare_and_write": false, 00:10:25.330 "abort": true, 00:10:25.330 "seek_hole": false, 00:10:25.330 "seek_data": false, 00:10:25.330 "copy": true, 00:10:25.330 "nvme_iov_md": false 00:10:25.330 }, 00:10:25.330 "memory_domains": [ 00:10:25.330 { 00:10:25.330 "dma_device_id": "system", 00:10:25.330 "dma_device_type": 1 00:10:25.330 }, 00:10:25.330 { 00:10:25.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:25.330 "dma_device_type": 2 00:10:25.330 } 00:10:25.330 ], 00:10:25.330 "driver_specific": {} 00:10:25.330 }' 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:25.330 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:25.589 13:29:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:25.847 [2024-07-15 13:29:05.142447] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:25.847 [2024-07-15 13:29:05.142473] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:25.847 [2024-07-15 13:29:05.142514] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:25.847 13:29:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.847 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:26.106 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:26.106 "name": "Existed_Raid", 00:10:26.106 "uuid": "1a0bdda4-3e0d-4a68-8b47-d6e91a222a44", 00:10:26.106 "strip_size_kb": 64, 00:10:26.106 "state": "offline", 00:10:26.106 "raid_level": "raid0", 00:10:26.106 "superblock": false, 00:10:26.106 "num_base_bdevs": 2, 00:10:26.106 "num_base_bdevs_discovered": 1, 00:10:26.106 "num_base_bdevs_operational": 1, 00:10:26.106 "base_bdevs_list": [ 00:10:26.106 { 00:10:26.106 "name": null, 00:10:26.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:26.106 "is_configured": false, 00:10:26.106 "data_offset": 0, 00:10:26.106 "data_size": 65536 00:10:26.106 }, 00:10:26.106 { 00:10:26.106 "name": "BaseBdev2", 00:10:26.106 "uuid": "ae11ef79-97c9-4e0b-913d-6051b41c53a7", 00:10:26.106 "is_configured": true, 00:10:26.106 "data_offset": 0, 00:10:26.106 "data_size": 65536 00:10:26.106 } 00:10:26.106 ] 00:10:26.106 }' 00:10:26.106 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:26.106 13:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.674 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:26.674 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # 
(( i < num_base_bdevs )) 00:10:26.674 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:26.674 13:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:26.933 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:26.933 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:26.933 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:27.192 [2024-07-15 13:29:06.442955] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:27.192 [2024-07-15 13:29:06.443008] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2169000 name Existed_Raid, state offline 00:10:27.192 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:27.192 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:27.192 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.192 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2066020 00:10:27.451 13:29:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2066020 ']' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2066020 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2066020 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2066020' 00:10:27.451 killing process with pid 2066020 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2066020 00:10:27.451 [2024-07-15 13:29:06.777579] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:27.451 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2066020 00:10:27.451 [2024-07-15 13:29:06.778559] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:27.710 13:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:27.710 00:10:27.710 real 0m9.953s 00:10:27.710 user 0m17.565s 00:10:27.710 sys 0m1.925s 00:10:27.710 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:27.710 13:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:27.710 ************************************ 00:10:27.710 END TEST raid_state_function_test 00:10:27.710 ************************************ 00:10:27.710 13:29:07 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:10:27.710 13:29:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:27.710 13:29:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:27.710 13:29:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:27.710 13:29:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:27.710 ************************************ 00:10:27.710 START TEST raid_state_function_test_sb 00:10:27.710 ************************************ 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2067552 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2067552' 00:10:27.710 Process raid pid: 2067552 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2067552 /var/tmp/spdk-raid.sock 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2067552 ']' 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:27.710 13:29:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:27.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:27.710 13:29:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:27.710 [2024-07-15 13:29:07.118679] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:27.710 [2024-07-15 13:29:07.118741] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:27.970 [2024-07-15 13:29:07.246061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:27.970 [2024-07-15 13:29:07.347886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:28.229 [2024-07-15 13:29:07.408311] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:28.229 [2024-07-15 13:29:07.408336] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:28.797 13:29:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:28.797 13:29:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:28.797 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:29.365 [2024-07-15 13:29:08.517378] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:29.365 [2024-07-15 13:29:08.517421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:29.365 [2024-07-15 13:29:08.517432] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:29.365 [2024-07-15 13:29:08.517444] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.365 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:29.624 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:29.624 "name": "Existed_Raid", 00:10:29.624 "uuid": "b96634cb-8fca-4f17-9a24-81777fea449d", 00:10:29.624 "strip_size_kb": 64, 00:10:29.624 "state": "configuring", 00:10:29.624 "raid_level": "raid0", 00:10:29.624 "superblock": true, 00:10:29.624 "num_base_bdevs": 2, 00:10:29.624 "num_base_bdevs_discovered": 0, 00:10:29.624 "num_base_bdevs_operational": 2, 00:10:29.624 "base_bdevs_list": [ 00:10:29.624 { 00:10:29.624 "name": "BaseBdev1", 00:10:29.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:29.624 "is_configured": false, 00:10:29.624 "data_offset": 0, 00:10:29.624 "data_size": 0 00:10:29.624 }, 00:10:29.624 { 00:10:29.624 "name": "BaseBdev2", 00:10:29.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:29.624 "is_configured": false, 00:10:29.624 "data_offset": 0, 00:10:29.624 "data_size": 0 00:10:29.624 } 00:10:29.624 ] 00:10:29.624 }' 00:10:29.624 13:29:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.624 13:29:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:30.190 13:29:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:30.477 [2024-07-15 13:29:09.620140] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:30.477 [2024-07-15 13:29:09.620170] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1983a80 name Existed_Raid, state configuring 00:10:30.477 13:29:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.477 [2024-07-15 13:29:09.868824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.477 [2024-07-15 13:29:09.868849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.477 [2024-07-15 13:29:09.868858] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.477 [2024-07-15 13:29:09.868870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.477 13:29:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:30.735 [2024-07-15 13:29:10.123260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:30.735 BaseBdev1 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:30.735 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:31.000 
13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:31.258 [ 00:10:31.258 { 00:10:31.258 "name": "BaseBdev1", 00:10:31.258 "aliases": [ 00:10:31.258 "2fc95e0b-f0c3-43da-866d-b61d67b5247a" 00:10:31.258 ], 00:10:31.258 "product_name": "Malloc disk", 00:10:31.258 "block_size": 512, 00:10:31.258 "num_blocks": 65536, 00:10:31.258 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:31.258 "assigned_rate_limits": { 00:10:31.258 "rw_ios_per_sec": 0, 00:10:31.258 "rw_mbytes_per_sec": 0, 00:10:31.258 "r_mbytes_per_sec": 0, 00:10:31.258 "w_mbytes_per_sec": 0 00:10:31.258 }, 00:10:31.258 "claimed": true, 00:10:31.258 "claim_type": "exclusive_write", 00:10:31.258 "zoned": false, 00:10:31.258 "supported_io_types": { 00:10:31.258 "read": true, 00:10:31.258 "write": true, 00:10:31.258 "unmap": true, 00:10:31.258 "flush": true, 00:10:31.258 "reset": true, 00:10:31.258 "nvme_admin": false, 00:10:31.258 "nvme_io": false, 00:10:31.258 "nvme_io_md": false, 00:10:31.258 "write_zeroes": true, 00:10:31.258 "zcopy": true, 00:10:31.258 "get_zone_info": false, 00:10:31.258 "zone_management": false, 00:10:31.258 "zone_append": false, 00:10:31.258 "compare": false, 00:10:31.258 "compare_and_write": false, 00:10:31.258 "abort": true, 00:10:31.258 "seek_hole": false, 00:10:31.258 "seek_data": false, 00:10:31.258 "copy": true, 00:10:31.258 "nvme_iov_md": false 00:10:31.258 }, 00:10:31.258 "memory_domains": [ 00:10:31.258 { 00:10:31.258 "dma_device_id": "system", 00:10:31.258 "dma_device_type": 1 00:10:31.258 }, 00:10:31.258 { 00:10:31.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.258 "dma_device_type": 2 00:10:31.258 } 00:10:31.258 ], 00:10:31.258 "driver_specific": {} 00:10:31.258 } 00:10:31.258 ] 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:31.258 
13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.258 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:31.516 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:31.516 "name": "Existed_Raid", 00:10:31.516 "uuid": "731bf7f7-2c91-4da1-b841-53e875a52aeb", 00:10:31.516 "strip_size_kb": 64, 00:10:31.516 "state": "configuring", 00:10:31.516 "raid_level": "raid0", 00:10:31.516 "superblock": true, 00:10:31.516 "num_base_bdevs": 2, 00:10:31.516 "num_base_bdevs_discovered": 1, 00:10:31.516 "num_base_bdevs_operational": 2, 00:10:31.516 
"base_bdevs_list": [ 00:10:31.516 { 00:10:31.516 "name": "BaseBdev1", 00:10:31.516 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:31.516 "is_configured": true, 00:10:31.516 "data_offset": 2048, 00:10:31.516 "data_size": 63488 00:10:31.516 }, 00:10:31.516 { 00:10:31.516 "name": "BaseBdev2", 00:10:31.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.516 "is_configured": false, 00:10:31.516 "data_offset": 0, 00:10:31.516 "data_size": 0 00:10:31.516 } 00:10:31.516 ] 00:10:31.516 }' 00:10:31.516 13:29:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:31.516 13:29:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:32.083 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:32.342 [2024-07-15 13:29:11.703462] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:32.342 [2024-07-15 13:29:11.703504] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1983350 name Existed_Raid, state configuring 00:10:32.342 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:32.602 [2024-07-15 13:29:11.948158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:32.602 [2024-07-15 13:29:11.949767] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:32.602 [2024-07-15 13:29:11.949800] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.602 13:29:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.861 13:29:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.861 "name": "Existed_Raid", 00:10:32.861 "uuid": "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08", 00:10:32.861 "strip_size_kb": 64, 00:10:32.861 "state": "configuring", 00:10:32.861 "raid_level": "raid0", 00:10:32.861 "superblock": true, 00:10:32.861 "num_base_bdevs": 2, 00:10:32.861 
"num_base_bdevs_discovered": 1, 00:10:32.861 "num_base_bdevs_operational": 2, 00:10:32.861 "base_bdevs_list": [ 00:10:32.861 { 00:10:32.861 "name": "BaseBdev1", 00:10:32.861 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:32.861 "is_configured": true, 00:10:32.861 "data_offset": 2048, 00:10:32.861 "data_size": 63488 00:10:32.861 }, 00:10:32.861 { 00:10:32.861 "name": "BaseBdev2", 00:10:32.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.861 "is_configured": false, 00:10:32.861 "data_offset": 0, 00:10:32.861 "data_size": 0 00:10:32.861 } 00:10:32.861 ] 00:10:32.861 }' 00:10:32.861 13:29:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:32.861 13:29:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:33.429 13:29:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:33.687 [2024-07-15 13:29:13.042500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:33.687 [2024-07-15 13:29:13.042659] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1984000 00:10:33.687 [2024-07-15 13:29:13.042673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:33.687 [2024-07-15 13:29:13.042852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x189e0c0 00:10:33.687 [2024-07-15 13:29:13.042980] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1984000 00:10:33.687 [2024-07-15 13:29:13.042991] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1984000 00:10:33.687 [2024-07-15 13:29:13.043086] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:33.687 BaseBdev2 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:33.687 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:33.948 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:34.208 [ 00:10:34.208 { 00:10:34.208 "name": "BaseBdev2", 00:10:34.208 "aliases": [ 00:10:34.208 "d4111ddc-6618-41ca-a1ce-fe395124326b" 00:10:34.208 ], 00:10:34.208 "product_name": "Malloc disk", 00:10:34.208 "block_size": 512, 00:10:34.208 "num_blocks": 65536, 00:10:34.208 "uuid": "d4111ddc-6618-41ca-a1ce-fe395124326b", 00:10:34.208 "assigned_rate_limits": { 00:10:34.208 "rw_ios_per_sec": 0, 00:10:34.208 "rw_mbytes_per_sec": 0, 00:10:34.208 "r_mbytes_per_sec": 0, 00:10:34.208 "w_mbytes_per_sec": 0 00:10:34.208 }, 00:10:34.208 "claimed": true, 00:10:34.208 "claim_type": "exclusive_write", 00:10:34.208 "zoned": false, 00:10:34.208 "supported_io_types": { 00:10:34.208 "read": true, 00:10:34.208 "write": true, 00:10:34.208 "unmap": true, 00:10:34.208 "flush": true, 00:10:34.208 "reset": true, 00:10:34.208 "nvme_admin": false, 00:10:34.208 "nvme_io": false, 00:10:34.208 "nvme_io_md": false, 00:10:34.208 "write_zeroes": true, 
00:10:34.208 "zcopy": true, 00:10:34.208 "get_zone_info": false, 00:10:34.208 "zone_management": false, 00:10:34.208 "zone_append": false, 00:10:34.208 "compare": false, 00:10:34.208 "compare_and_write": false, 00:10:34.208 "abort": true, 00:10:34.208 "seek_hole": false, 00:10:34.208 "seek_data": false, 00:10:34.208 "copy": true, 00:10:34.208 "nvme_iov_md": false 00:10:34.208 }, 00:10:34.208 "memory_domains": [ 00:10:34.208 { 00:10:34.208 "dma_device_id": "system", 00:10:34.208 "dma_device_type": 1 00:10:34.208 }, 00:10:34.208 { 00:10:34.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.208 "dma_device_type": 2 00:10:34.208 } 00:10:34.208 ], 00:10:34.208 "driver_specific": {} 00:10:34.208 } 00:10:34.208 ] 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.208 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.468 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.468 "name": "Existed_Raid", 00:10:34.468 "uuid": "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08", 00:10:34.468 "strip_size_kb": 64, 00:10:34.468 "state": "online", 00:10:34.468 "raid_level": "raid0", 00:10:34.468 "superblock": true, 00:10:34.468 "num_base_bdevs": 2, 00:10:34.468 "num_base_bdevs_discovered": 2, 00:10:34.468 "num_base_bdevs_operational": 2, 00:10:34.468 "base_bdevs_list": [ 00:10:34.468 { 00:10:34.468 "name": "BaseBdev1", 00:10:34.468 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:34.468 "is_configured": true, 00:10:34.468 "data_offset": 2048, 00:10:34.468 "data_size": 63488 00:10:34.468 }, 00:10:34.468 { 00:10:34.468 "name": "BaseBdev2", 00:10:34.468 "uuid": "d4111ddc-6618-41ca-a1ce-fe395124326b", 00:10:34.468 "is_configured": true, 00:10:34.468 "data_offset": 2048, 00:10:34.468 "data_size": 63488 00:10:34.468 } 00:10:34.468 ] 00:10:34.468 }' 00:10:34.468 13:29:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.468 13:29:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:35.036 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:35.296 [2024-07-15 13:29:14.566808] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:35.296 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:35.296 "name": "Existed_Raid", 00:10:35.296 "aliases": [ 00:10:35.296 "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08" 00:10:35.296 ], 00:10:35.296 "product_name": "Raid Volume", 00:10:35.296 "block_size": 512, 00:10:35.296 "num_blocks": 126976, 00:10:35.296 "uuid": "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08", 00:10:35.296 "assigned_rate_limits": { 00:10:35.296 "rw_ios_per_sec": 0, 00:10:35.296 "rw_mbytes_per_sec": 0, 00:10:35.296 "r_mbytes_per_sec": 0, 00:10:35.296 "w_mbytes_per_sec": 0 00:10:35.296 }, 00:10:35.296 "claimed": false, 00:10:35.296 "zoned": false, 00:10:35.296 "supported_io_types": { 00:10:35.296 "read": true, 00:10:35.296 "write": true, 00:10:35.296 "unmap": true, 00:10:35.296 "flush": true, 00:10:35.296 "reset": true, 00:10:35.296 "nvme_admin": false, 00:10:35.296 "nvme_io": false, 00:10:35.296 "nvme_io_md": false, 00:10:35.296 "write_zeroes": true, 00:10:35.296 "zcopy": false, 00:10:35.296 "get_zone_info": false, 00:10:35.296 "zone_management": false, 00:10:35.296 
"zone_append": false, 00:10:35.296 "compare": false, 00:10:35.296 "compare_and_write": false, 00:10:35.296 "abort": false, 00:10:35.296 "seek_hole": false, 00:10:35.296 "seek_data": false, 00:10:35.296 "copy": false, 00:10:35.296 "nvme_iov_md": false 00:10:35.296 }, 00:10:35.296 "memory_domains": [ 00:10:35.296 { 00:10:35.296 "dma_device_id": "system", 00:10:35.296 "dma_device_type": 1 00:10:35.296 }, 00:10:35.296 { 00:10:35.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.296 "dma_device_type": 2 00:10:35.296 }, 00:10:35.296 { 00:10:35.296 "dma_device_id": "system", 00:10:35.296 "dma_device_type": 1 00:10:35.296 }, 00:10:35.296 { 00:10:35.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.296 "dma_device_type": 2 00:10:35.296 } 00:10:35.296 ], 00:10:35.296 "driver_specific": { 00:10:35.296 "raid": { 00:10:35.296 "uuid": "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08", 00:10:35.296 "strip_size_kb": 64, 00:10:35.296 "state": "online", 00:10:35.296 "raid_level": "raid0", 00:10:35.296 "superblock": true, 00:10:35.296 "num_base_bdevs": 2, 00:10:35.296 "num_base_bdevs_discovered": 2, 00:10:35.296 "num_base_bdevs_operational": 2, 00:10:35.296 "base_bdevs_list": [ 00:10:35.296 { 00:10:35.296 "name": "BaseBdev1", 00:10:35.296 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:35.296 "is_configured": true, 00:10:35.296 "data_offset": 2048, 00:10:35.296 "data_size": 63488 00:10:35.296 }, 00:10:35.296 { 00:10:35.296 "name": "BaseBdev2", 00:10:35.296 "uuid": "d4111ddc-6618-41ca-a1ce-fe395124326b", 00:10:35.296 "is_configured": true, 00:10:35.296 "data_offset": 2048, 00:10:35.296 "data_size": 63488 00:10:35.296 } 00:10:35.296 ] 00:10:35.296 } 00:10:35.296 } 00:10:35.296 }' 00:10:35.297 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:35.297 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:35.297 
BaseBdev2' 00:10:35.297 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.297 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:35.297 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:35.556 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:35.556 "name": "BaseBdev1", 00:10:35.556 "aliases": [ 00:10:35.556 "2fc95e0b-f0c3-43da-866d-b61d67b5247a" 00:10:35.556 ], 00:10:35.556 "product_name": "Malloc disk", 00:10:35.556 "block_size": 512, 00:10:35.556 "num_blocks": 65536, 00:10:35.556 "uuid": "2fc95e0b-f0c3-43da-866d-b61d67b5247a", 00:10:35.556 "assigned_rate_limits": { 00:10:35.556 "rw_ios_per_sec": 0, 00:10:35.556 "rw_mbytes_per_sec": 0, 00:10:35.556 "r_mbytes_per_sec": 0, 00:10:35.556 "w_mbytes_per_sec": 0 00:10:35.556 }, 00:10:35.556 "claimed": true, 00:10:35.556 "claim_type": "exclusive_write", 00:10:35.556 "zoned": false, 00:10:35.556 "supported_io_types": { 00:10:35.556 "read": true, 00:10:35.556 "write": true, 00:10:35.556 "unmap": true, 00:10:35.556 "flush": true, 00:10:35.556 "reset": true, 00:10:35.556 "nvme_admin": false, 00:10:35.556 "nvme_io": false, 00:10:35.556 "nvme_io_md": false, 00:10:35.556 "write_zeroes": true, 00:10:35.556 "zcopy": true, 00:10:35.556 "get_zone_info": false, 00:10:35.556 "zone_management": false, 00:10:35.556 "zone_append": false, 00:10:35.556 "compare": false, 00:10:35.556 "compare_and_write": false, 00:10:35.556 "abort": true, 00:10:35.556 "seek_hole": false, 00:10:35.556 "seek_data": false, 00:10:35.556 "copy": true, 00:10:35.556 "nvme_iov_md": false 00:10:35.556 }, 00:10:35.556 "memory_domains": [ 00:10:35.556 { 00:10:35.556 "dma_device_id": "system", 00:10:35.556 "dma_device_type": 1 00:10:35.556 }, 00:10:35.556 { 
00:10:35.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.556 "dma_device_type": 2 00:10:35.556 } 00:10:35.556 ], 00:10:35.556 "driver_specific": {} 00:10:35.556 }' 00:10:35.556 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.556 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.556 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:35.556 13:29:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:35.815 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:36.075 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:36.075 "name": 
"BaseBdev2", 00:10:36.075 "aliases": [ 00:10:36.075 "d4111ddc-6618-41ca-a1ce-fe395124326b" 00:10:36.075 ], 00:10:36.075 "product_name": "Malloc disk", 00:10:36.075 "block_size": 512, 00:10:36.075 "num_blocks": 65536, 00:10:36.075 "uuid": "d4111ddc-6618-41ca-a1ce-fe395124326b", 00:10:36.075 "assigned_rate_limits": { 00:10:36.075 "rw_ios_per_sec": 0, 00:10:36.075 "rw_mbytes_per_sec": 0, 00:10:36.075 "r_mbytes_per_sec": 0, 00:10:36.075 "w_mbytes_per_sec": 0 00:10:36.075 }, 00:10:36.075 "claimed": true, 00:10:36.075 "claim_type": "exclusive_write", 00:10:36.075 "zoned": false, 00:10:36.075 "supported_io_types": { 00:10:36.075 "read": true, 00:10:36.075 "write": true, 00:10:36.075 "unmap": true, 00:10:36.075 "flush": true, 00:10:36.075 "reset": true, 00:10:36.075 "nvme_admin": false, 00:10:36.075 "nvme_io": false, 00:10:36.075 "nvme_io_md": false, 00:10:36.075 "write_zeroes": true, 00:10:36.075 "zcopy": true, 00:10:36.075 "get_zone_info": false, 00:10:36.075 "zone_management": false, 00:10:36.075 "zone_append": false, 00:10:36.075 "compare": false, 00:10:36.075 "compare_and_write": false, 00:10:36.075 "abort": true, 00:10:36.075 "seek_hole": false, 00:10:36.075 "seek_data": false, 00:10:36.075 "copy": true, 00:10:36.075 "nvme_iov_md": false 00:10:36.075 }, 00:10:36.075 "memory_domains": [ 00:10:36.075 { 00:10:36.075 "dma_device_id": "system", 00:10:36.075 "dma_device_type": 1 00:10:36.075 }, 00:10:36.075 { 00:10:36.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.075 "dma_device_type": 2 00:10:36.075 } 00:10:36.075 ], 00:10:36.075 "driver_specific": {} 00:10:36.075 }' 00:10:36.075 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.075 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.075 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:36.075 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.334 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:36.593 [2024-07-15 13:29:15.970302] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:36.593 [2024-07-15 13:29:15.970327] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.593 [2024-07-15 13:29:15.970368] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:36.593 13:29:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.593 13:29:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:36.852 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:36.852 "name": "Existed_Raid", 00:10:36.852 "uuid": "9eb99fdd-9e65-4a7d-be0e-6c91e44c0d08", 00:10:36.852 "strip_size_kb": 64, 00:10:36.852 "state": "offline", 00:10:36.852 "raid_level": "raid0", 00:10:36.852 "superblock": true, 00:10:36.852 "num_base_bdevs": 2, 00:10:36.852 "num_base_bdevs_discovered": 1, 00:10:36.852 "num_base_bdevs_operational": 1, 00:10:36.852 "base_bdevs_list": [ 
00:10:36.852 { 00:10:36.852 "name": null, 00:10:36.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.852 "is_configured": false, 00:10:36.852 "data_offset": 2048, 00:10:36.852 "data_size": 63488 00:10:36.852 }, 00:10:36.852 { 00:10:36.852 "name": "BaseBdev2", 00:10:36.852 "uuid": "d4111ddc-6618-41ca-a1ce-fe395124326b", 00:10:36.852 "is_configured": true, 00:10:36.852 "data_offset": 2048, 00:10:36.852 "data_size": 63488 00:10:36.852 } 00:10:36.852 ] 00:10:36.852 }' 00:10:36.852 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:36.852 13:29:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:37.420 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:37.420 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:37.420 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.420 13:29:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:37.678 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:37.678 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:37.678 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:37.936 [2024-07-15 13:29:17.234656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:37.936 [2024-07-15 13:29:17.234704] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1984000 name Existed_Raid, state offline 00:10:37.936 13:29:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:37.936 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:37.936 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.936 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2067552 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2067552 ']' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2067552 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2067552 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2067552' 00:10:38.194 killing process with pid 2067552 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 2067552 00:10:38.194 [2024-07-15 13:29:17.563907] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:38.194 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2067552 00:10:38.194 [2024-07-15 13:29:17.564875] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:38.453 13:29:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:38.453 00:10:38.453 real 0m10.720s 00:10:38.453 user 0m19.098s 00:10:38.453 sys 0m1.955s 00:10:38.453 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:38.453 13:29:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:38.453 ************************************ 00:10:38.453 END TEST raid_state_function_test_sb 00:10:38.453 ************************************ 00:10:38.453 13:29:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:38.453 13:29:17 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:38.453 13:29:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:38.453 13:29:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.453 13:29:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:38.453 ************************************ 00:10:38.453 START TEST raid_superblock_test 00:10:38.453 ************************************ 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:38.453 13:29:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2069186 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2069186 /var/tmp/spdk-raid.sock 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2069186 ']' 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:38.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:38.453 13:29:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.711 [2024-07-15 13:29:17.916472] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:38.711 [2024-07-15 13:29:17.916542] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2069186 ] 00:10:38.711 [2024-07-15 13:29:18.035703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.969 [2024-07-15 13:29:18.140604] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:38.969 [2024-07-15 13:29:18.215630] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.969 [2024-07-15 13:29:18.215664] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:39.535 13:29:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:39.535 13:29:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:39.535 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:39.535 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:39.535 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:39.536 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:39.794 malloc1 00:10:39.794 13:29:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:40.052 [2024-07-15 13:29:19.227836] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:40.052 [2024-07-15 13:29:19.227883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:40.052 [2024-07-15 13:29:19.227905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x149b570 00:10:40.052 [2024-07-15 13:29:19.227918] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:40.052 [2024-07-15 13:29:19.229646] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:40.052 [2024-07-15 13:29:19.229678] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:40.052 pt1 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:40.052 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:40.344 malloc2 00:10:40.344 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:40.603 [2024-07-15 13:29:19.775342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:40.603 [2024-07-15 13:29:19.775388] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:40.603 [2024-07-15 13:29:19.775408] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x149c970 00:10:40.603 [2024-07-15 13:29:19.775420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:40.604 [2024-07-15 13:29:19.777091] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:40.604 [2024-07-15 13:29:19.777119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:40.604 pt2 00:10:40.604 13:29:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:40.604 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:40.604 13:29:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:40.604 [2024-07-15 13:29:20.020020] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:40.604 [2024-07-15 13:29:20.021416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:40.604 [2024-07-15 13:29:20.021570] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x163f270 00:10:40.604 [2024-07-15 13:29:20.021583] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:40.604 [2024-07-15 13:29:20.021787] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1634c10 00:10:40.604 [2024-07-15 13:29:20.021945] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x163f270 00:10:40.604 [2024-07-15 13:29:20.021956] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x163f270 00:10:40.604 [2024-07-15 13:29:20.022060] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.862 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:41.121 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.122 "name": "raid_bdev1", 00:10:41.122 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:41.122 "strip_size_kb": 64, 00:10:41.122 "state": "online", 00:10:41.122 "raid_level": "raid0", 00:10:41.122 "superblock": true, 00:10:41.122 "num_base_bdevs": 2, 00:10:41.122 "num_base_bdevs_discovered": 2, 00:10:41.122 "num_base_bdevs_operational": 2, 00:10:41.122 "base_bdevs_list": [ 00:10:41.122 { 00:10:41.122 "name": "pt1", 00:10:41.122 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:41.122 "is_configured": true, 00:10:41.122 "data_offset": 2048, 00:10:41.122 "data_size": 63488 00:10:41.122 }, 00:10:41.122 { 00:10:41.122 "name": "pt2", 00:10:41.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:41.122 "is_configured": true, 00:10:41.122 "data_offset": 2048, 00:10:41.122 "data_size": 63488 00:10:41.122 } 00:10:41.122 ] 00:10:41.122 }' 00:10:41.122 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.122 13:29:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.688 13:29:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:41.688 13:29:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:41.688 [2024-07-15 13:29:21.107097] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:41.947 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:41.947 "name": "raid_bdev1", 00:10:41.947 "aliases": [ 00:10:41.947 "3c989d8a-5b38-4025-8d26-45521b352829" 00:10:41.947 ], 00:10:41.947 "product_name": "Raid Volume", 00:10:41.947 "block_size": 512, 00:10:41.947 "num_blocks": 126976, 00:10:41.947 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:41.947 "assigned_rate_limits": { 00:10:41.947 "rw_ios_per_sec": 0, 00:10:41.947 "rw_mbytes_per_sec": 0, 00:10:41.947 "r_mbytes_per_sec": 0, 00:10:41.947 "w_mbytes_per_sec": 0 00:10:41.947 }, 00:10:41.947 "claimed": false, 00:10:41.947 "zoned": false, 00:10:41.947 "supported_io_types": { 00:10:41.947 "read": true, 00:10:41.947 "write": true, 00:10:41.947 "unmap": true, 00:10:41.947 "flush": true, 00:10:41.947 "reset": true, 00:10:41.947 "nvme_admin": false, 00:10:41.947 "nvme_io": false, 00:10:41.947 "nvme_io_md": false, 00:10:41.947 "write_zeroes": 
true, 00:10:41.947 "zcopy": false, 00:10:41.947 "get_zone_info": false, 00:10:41.947 "zone_management": false, 00:10:41.947 "zone_append": false, 00:10:41.947 "compare": false, 00:10:41.947 "compare_and_write": false, 00:10:41.947 "abort": false, 00:10:41.947 "seek_hole": false, 00:10:41.947 "seek_data": false, 00:10:41.947 "copy": false, 00:10:41.947 "nvme_iov_md": false 00:10:41.947 }, 00:10:41.947 "memory_domains": [ 00:10:41.947 { 00:10:41.947 "dma_device_id": "system", 00:10:41.947 "dma_device_type": 1 00:10:41.947 }, 00:10:41.947 { 00:10:41.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.947 "dma_device_type": 2 00:10:41.947 }, 00:10:41.947 { 00:10:41.947 "dma_device_id": "system", 00:10:41.947 "dma_device_type": 1 00:10:41.947 }, 00:10:41.947 { 00:10:41.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.947 "dma_device_type": 2 00:10:41.947 } 00:10:41.947 ], 00:10:41.947 "driver_specific": { 00:10:41.947 "raid": { 00:10:41.947 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:41.947 "strip_size_kb": 64, 00:10:41.947 "state": "online", 00:10:41.947 "raid_level": "raid0", 00:10:41.947 "superblock": true, 00:10:41.947 "num_base_bdevs": 2, 00:10:41.947 "num_base_bdevs_discovered": 2, 00:10:41.947 "num_base_bdevs_operational": 2, 00:10:41.947 "base_bdevs_list": [ 00:10:41.947 { 00:10:41.947 "name": "pt1", 00:10:41.947 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:41.947 "is_configured": true, 00:10:41.947 "data_offset": 2048, 00:10:41.947 "data_size": 63488 00:10:41.947 }, 00:10:41.947 { 00:10:41.947 "name": "pt2", 00:10:41.947 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:41.947 "is_configured": true, 00:10:41.947 "data_offset": 2048, 00:10:41.947 "data_size": 63488 00:10:41.947 } 00:10:41.947 ] 00:10:41.947 } 00:10:41.947 } 00:10:41.947 }' 00:10:41.947 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:41.947 13:29:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:41.947 pt2' 00:10:41.947 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:41.948 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:41.948 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:42.206 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:42.206 "name": "pt1", 00:10:42.206 "aliases": [ 00:10:42.206 "00000000-0000-0000-0000-000000000001" 00:10:42.206 ], 00:10:42.206 "product_name": "passthru", 00:10:42.206 "block_size": 512, 00:10:42.206 "num_blocks": 65536, 00:10:42.206 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:42.206 "assigned_rate_limits": { 00:10:42.206 "rw_ios_per_sec": 0, 00:10:42.206 "rw_mbytes_per_sec": 0, 00:10:42.206 "r_mbytes_per_sec": 0, 00:10:42.206 "w_mbytes_per_sec": 0 00:10:42.206 }, 00:10:42.206 "claimed": true, 00:10:42.206 "claim_type": "exclusive_write", 00:10:42.206 "zoned": false, 00:10:42.206 "supported_io_types": { 00:10:42.206 "read": true, 00:10:42.206 "write": true, 00:10:42.206 "unmap": true, 00:10:42.206 "flush": true, 00:10:42.206 "reset": true, 00:10:42.206 "nvme_admin": false, 00:10:42.206 "nvme_io": false, 00:10:42.206 "nvme_io_md": false, 00:10:42.206 "write_zeroes": true, 00:10:42.206 "zcopy": true, 00:10:42.206 "get_zone_info": false, 00:10:42.206 "zone_management": false, 00:10:42.206 "zone_append": false, 00:10:42.206 "compare": false, 00:10:42.207 "compare_and_write": false, 00:10:42.207 "abort": true, 00:10:42.207 "seek_hole": false, 00:10:42.207 "seek_data": false, 00:10:42.207 "copy": true, 00:10:42.207 "nvme_iov_md": false 00:10:42.207 }, 00:10:42.207 "memory_domains": [ 00:10:42.207 { 00:10:42.207 "dma_device_id": "system", 00:10:42.207 
"dma_device_type": 1 00:10:42.207 }, 00:10:42.207 { 00:10:42.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.207 "dma_device_type": 2 00:10:42.207 } 00:10:42.207 ], 00:10:42.207 "driver_specific": { 00:10:42.207 "passthru": { 00:10:42.207 "name": "pt1", 00:10:42.207 "base_bdev_name": "malloc1" 00:10:42.207 } 00:10:42.207 } 00:10:42.207 }' 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:42.207 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:42.465 13:29:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:42.723 13:29:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:42.723 "name": "pt2", 00:10:42.723 "aliases": [ 00:10:42.723 "00000000-0000-0000-0000-000000000002" 00:10:42.724 ], 00:10:42.724 "product_name": "passthru", 00:10:42.724 "block_size": 512, 00:10:42.724 "num_blocks": 65536, 00:10:42.724 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:42.724 "assigned_rate_limits": { 00:10:42.724 "rw_ios_per_sec": 0, 00:10:42.724 "rw_mbytes_per_sec": 0, 00:10:42.724 "r_mbytes_per_sec": 0, 00:10:42.724 "w_mbytes_per_sec": 0 00:10:42.724 }, 00:10:42.724 "claimed": true, 00:10:42.724 "claim_type": "exclusive_write", 00:10:42.724 "zoned": false, 00:10:42.724 "supported_io_types": { 00:10:42.724 "read": true, 00:10:42.724 "write": true, 00:10:42.724 "unmap": true, 00:10:42.724 "flush": true, 00:10:42.724 "reset": true, 00:10:42.724 "nvme_admin": false, 00:10:42.724 "nvme_io": false, 00:10:42.724 "nvme_io_md": false, 00:10:42.724 "write_zeroes": true, 00:10:42.724 "zcopy": true, 00:10:42.724 "get_zone_info": false, 00:10:42.724 "zone_management": false, 00:10:42.724 "zone_append": false, 00:10:42.724 "compare": false, 00:10:42.724 "compare_and_write": false, 00:10:42.724 "abort": true, 00:10:42.724 "seek_hole": false, 00:10:42.724 "seek_data": false, 00:10:42.724 "copy": true, 00:10:42.724 "nvme_iov_md": false 00:10:42.724 }, 00:10:42.724 "memory_domains": [ 00:10:42.724 { 00:10:42.724 "dma_device_id": "system", 00:10:42.724 "dma_device_type": 1 00:10:42.724 }, 00:10:42.724 { 00:10:42.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.724 "dma_device_type": 2 00:10:42.724 } 00:10:42.724 ], 00:10:42.724 "driver_specific": { 00:10:42.724 "passthru": { 00:10:42.724 "name": "pt2", 00:10:42.724 "base_bdev_name": "malloc2" 00:10:42.724 } 00:10:42.724 } 00:10:42.724 }' 00:10:42.724 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.724 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:42.724 13:29:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:42.724 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.724 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:42.982 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:43.241 [2024-07-15 13:29:22.583005] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:43.241 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3c989d8a-5b38-4025-8d26-45521b352829 00:10:43.241 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 3c989d8a-5b38-4025-8d26-45521b352829 ']' 00:10:43.241 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:43.500 [2024-07-15 13:29:22.831422] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:43.500 
[2024-07-15 13:29:22.831441] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:43.500 [2024-07-15 13:29:22.831492] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:43.500 [2024-07-15 13:29:22.831535] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:43.500 [2024-07-15 13:29:22.831546] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x163f270 name raid_bdev1, state offline 00:10:43.500 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.500 13:29:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:43.759 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:43.759 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:43.759 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:43.759 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:44.016 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:44.016 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:44.274 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:44.274 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:44.532 13:29:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:44.532 13:29:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:44.790 [2024-07-15 13:29:24.062651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:44.790 [2024-07-15 13:29:24.064051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:44.790 [2024-07-15 13:29:24.064105] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:44.790 [2024-07-15 13:29:24.064145] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:44.790 [2024-07-15 13:29:24.064165] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:44.790 [2024-07-15 13:29:24.064175] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x163eff0 name raid_bdev1, state configuring 00:10:44.790 request: 00:10:44.790 { 00:10:44.790 "name": "raid_bdev1", 00:10:44.790 "raid_level": "raid0", 00:10:44.790 "base_bdevs": [ 00:10:44.790 "malloc1", 00:10:44.790 "malloc2" 00:10:44.790 ], 00:10:44.790 "strip_size_kb": 64, 00:10:44.790 "superblock": false, 00:10:44.790 "method": "bdev_raid_create", 00:10:44.790 "req_id": 1 00:10:44.790 } 00:10:44.790 Got JSON-RPC error response 00:10:44.790 response: 00:10:44.790 { 00:10:44.790 "code": -17, 00:10:44.790 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:44.790 } 00:10:44.790 13:29:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:44.790 13:29:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:44.790 13:29:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:44.790 13:29:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:44.790 13:29:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.790 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:45.049 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:45.049 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:45.049 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:45.308 [2024-07-15 13:29:24.539850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:45.308 [2024-07-15 13:29:24.539891] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:45.308 [2024-07-15 13:29:24.539915] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x149b7a0 00:10:45.308 [2024-07-15 13:29:24.539933] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:45.308 [2024-07-15 13:29:24.541498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:45.308 [2024-07-15 13:29:24.541527] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:45.308 [2024-07-15 13:29:24.541591] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:45.308 [2024-07-15 13:29:24.541616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:45.308 pt1 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:45.308 13:29:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.308 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:45.567 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.567 "name": "raid_bdev1", 00:10:45.567 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:45.567 "strip_size_kb": 64, 00:10:45.567 "state": "configuring", 00:10:45.567 "raid_level": "raid0", 00:10:45.567 "superblock": true, 00:10:45.567 "num_base_bdevs": 2, 00:10:45.567 "num_base_bdevs_discovered": 1, 00:10:45.567 "num_base_bdevs_operational": 2, 00:10:45.567 "base_bdevs_list": [ 00:10:45.567 { 00:10:45.567 "name": "pt1", 00:10:45.567 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:45.567 "is_configured": true, 00:10:45.567 "data_offset": 2048, 00:10:45.567 "data_size": 63488 00:10:45.567 }, 00:10:45.567 { 00:10:45.567 "name": null, 00:10:45.567 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:45.567 
"is_configured": false, 00:10:45.567 "data_offset": 2048, 00:10:45.567 "data_size": 63488 00:10:45.567 } 00:10:45.567 ] 00:10:45.567 }' 00:10:45.567 13:29:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.567 13:29:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.133 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:46.133 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:46.133 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:46.134 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:46.393 [2024-07-15 13:29:25.586658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:46.393 [2024-07-15 13:29:25.586705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:46.393 [2024-07-15 13:29:25.586724] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1635820 00:10:46.393 [2024-07-15 13:29:25.586737] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:46.393 [2024-07-15 13:29:25.587101] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:46.393 [2024-07-15 13:29:25.587122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:46.393 [2024-07-15 13:29:25.587186] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:46.393 [2024-07-15 13:29:25.587206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:46.393 [2024-07-15 13:29:25.587304] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1491ec0 00:10:46.393 [2024-07-15 
13:29:25.587316] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:46.393 [2024-07-15 13:29:25.587487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1494530 00:10:46.393 [2024-07-15 13:29:25.587607] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1491ec0 00:10:46.393 [2024-07-15 13:29:25.587617] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1491ec0 00:10:46.393 [2024-07-15 13:29:25.587716] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.393 pt2 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.393 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:46.652 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:46.652 "name": "raid_bdev1", 00:10:46.652 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:46.652 "strip_size_kb": 64, 00:10:46.652 "state": "online", 00:10:46.652 "raid_level": "raid0", 00:10:46.652 "superblock": true, 00:10:46.652 "num_base_bdevs": 2, 00:10:46.652 "num_base_bdevs_discovered": 2, 00:10:46.652 "num_base_bdevs_operational": 2, 00:10:46.652 "base_bdevs_list": [ 00:10:46.652 { 00:10:46.652 "name": "pt1", 00:10:46.652 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:46.652 "is_configured": true, 00:10:46.652 "data_offset": 2048, 00:10:46.652 "data_size": 63488 00:10:46.652 }, 00:10:46.652 { 00:10:46.652 "name": "pt2", 00:10:46.652 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:46.652 "is_configured": true, 00:10:46.652 "data_offset": 2048, 00:10:46.652 "data_size": 63488 00:10:46.652 } 00:10:46.652 ] 00:10:46.652 }' 00:10:46.652 13:29:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:46.652 13:29:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:47.221 13:29:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:47.221 [2024-07-15 13:29:26.605609] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:47.221 "name": "raid_bdev1", 00:10:47.221 "aliases": [ 00:10:47.221 "3c989d8a-5b38-4025-8d26-45521b352829" 00:10:47.221 ], 00:10:47.221 "product_name": "Raid Volume", 00:10:47.221 "block_size": 512, 00:10:47.221 "num_blocks": 126976, 00:10:47.221 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:47.221 "assigned_rate_limits": { 00:10:47.221 "rw_ios_per_sec": 0, 00:10:47.221 "rw_mbytes_per_sec": 0, 00:10:47.221 "r_mbytes_per_sec": 0, 00:10:47.221 "w_mbytes_per_sec": 0 00:10:47.221 }, 00:10:47.221 "claimed": false, 00:10:47.221 "zoned": false, 00:10:47.221 "supported_io_types": { 00:10:47.221 "read": true, 00:10:47.221 "write": true, 00:10:47.221 "unmap": true, 00:10:47.221 "flush": true, 00:10:47.221 "reset": true, 00:10:47.221 "nvme_admin": false, 00:10:47.221 "nvme_io": false, 00:10:47.221 "nvme_io_md": false, 00:10:47.221 "write_zeroes": true, 00:10:47.221 "zcopy": false, 00:10:47.221 "get_zone_info": false, 00:10:47.221 "zone_management": false, 00:10:47.221 "zone_append": false, 00:10:47.221 "compare": false, 00:10:47.221 "compare_and_write": false, 00:10:47.221 "abort": false, 00:10:47.221 "seek_hole": false, 00:10:47.221 "seek_data": false, 00:10:47.221 "copy": false, 00:10:47.221 "nvme_iov_md": false 00:10:47.221 }, 00:10:47.221 "memory_domains": [ 00:10:47.221 { 00:10:47.221 "dma_device_id": "system", 00:10:47.221 "dma_device_type": 1 00:10:47.221 }, 00:10:47.221 { 
00:10:47.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.221 "dma_device_type": 2 00:10:47.221 }, 00:10:47.221 { 00:10:47.221 "dma_device_id": "system", 00:10:47.221 "dma_device_type": 1 00:10:47.221 }, 00:10:47.221 { 00:10:47.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.221 "dma_device_type": 2 00:10:47.221 } 00:10:47.221 ], 00:10:47.221 "driver_specific": { 00:10:47.221 "raid": { 00:10:47.221 "uuid": "3c989d8a-5b38-4025-8d26-45521b352829", 00:10:47.221 "strip_size_kb": 64, 00:10:47.221 "state": "online", 00:10:47.221 "raid_level": "raid0", 00:10:47.221 "superblock": true, 00:10:47.221 "num_base_bdevs": 2, 00:10:47.221 "num_base_bdevs_discovered": 2, 00:10:47.221 "num_base_bdevs_operational": 2, 00:10:47.221 "base_bdevs_list": [ 00:10:47.221 { 00:10:47.221 "name": "pt1", 00:10:47.221 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:47.221 "is_configured": true, 00:10:47.221 "data_offset": 2048, 00:10:47.221 "data_size": 63488 00:10:47.221 }, 00:10:47.221 { 00:10:47.221 "name": "pt2", 00:10:47.221 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:47.221 "is_configured": true, 00:10:47.221 "data_offset": 2048, 00:10:47.221 "data_size": 63488 00:10:47.221 } 00:10:47.221 ] 00:10:47.221 } 00:10:47.221 } 00:10:47.221 }' 00:10:47.221 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:47.481 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:47.481 pt2' 00:10:47.481 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:47.481 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:47.481 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:47.740 13:29:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:47.740 "name": "pt1", 00:10:47.740 "aliases": [ 00:10:47.740 "00000000-0000-0000-0000-000000000001" 00:10:47.740 ], 00:10:47.740 "product_name": "passthru", 00:10:47.740 "block_size": 512, 00:10:47.740 "num_blocks": 65536, 00:10:47.740 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:47.740 "assigned_rate_limits": { 00:10:47.740 "rw_ios_per_sec": 0, 00:10:47.740 "rw_mbytes_per_sec": 0, 00:10:47.740 "r_mbytes_per_sec": 0, 00:10:47.740 "w_mbytes_per_sec": 0 00:10:47.740 }, 00:10:47.740 "claimed": true, 00:10:47.740 "claim_type": "exclusive_write", 00:10:47.740 "zoned": false, 00:10:47.740 "supported_io_types": { 00:10:47.740 "read": true, 00:10:47.740 "write": true, 00:10:47.740 "unmap": true, 00:10:47.740 "flush": true, 00:10:47.740 "reset": true, 00:10:47.740 "nvme_admin": false, 00:10:47.740 "nvme_io": false, 00:10:47.740 "nvme_io_md": false, 00:10:47.740 "write_zeroes": true, 00:10:47.740 "zcopy": true, 00:10:47.740 "get_zone_info": false, 00:10:47.740 "zone_management": false, 00:10:47.740 "zone_append": false, 00:10:47.740 "compare": false, 00:10:47.740 "compare_and_write": false, 00:10:47.740 "abort": true, 00:10:47.740 "seek_hole": false, 00:10:47.740 "seek_data": false, 00:10:47.740 "copy": true, 00:10:47.740 "nvme_iov_md": false 00:10:47.740 }, 00:10:47.740 "memory_domains": [ 00:10:47.740 { 00:10:47.740 "dma_device_id": "system", 00:10:47.740 "dma_device_type": 1 00:10:47.740 }, 00:10:47.740 { 00:10:47.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.740 "dma_device_type": 2 00:10:47.740 } 00:10:47.740 ], 00:10:47.740 "driver_specific": { 00:10:47.740 "passthru": { 00:10:47.740 "name": "pt1", 00:10:47.740 "base_bdev_name": "malloc1" 00:10:47.740 } 00:10:47.740 } 00:10:47.740 }' 00:10:47.740 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:47.740 13:29:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.740 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:47.998 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:48.256 "name": "pt2", 00:10:48.256 "aliases": [ 00:10:48.256 "00000000-0000-0000-0000-000000000002" 00:10:48.256 ], 00:10:48.256 "product_name": "passthru", 00:10:48.256 "block_size": 512, 00:10:48.256 "num_blocks": 65536, 00:10:48.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:48.256 "assigned_rate_limits": { 00:10:48.256 "rw_ios_per_sec": 0, 00:10:48.256 "rw_mbytes_per_sec": 0, 00:10:48.256 "r_mbytes_per_sec": 0, 00:10:48.256 "w_mbytes_per_sec": 0 00:10:48.256 }, 
00:10:48.256 "claimed": true, 00:10:48.256 "claim_type": "exclusive_write", 00:10:48.256 "zoned": false, 00:10:48.256 "supported_io_types": { 00:10:48.256 "read": true, 00:10:48.256 "write": true, 00:10:48.256 "unmap": true, 00:10:48.256 "flush": true, 00:10:48.256 "reset": true, 00:10:48.256 "nvme_admin": false, 00:10:48.256 "nvme_io": false, 00:10:48.256 "nvme_io_md": false, 00:10:48.256 "write_zeroes": true, 00:10:48.256 "zcopy": true, 00:10:48.256 "get_zone_info": false, 00:10:48.256 "zone_management": false, 00:10:48.256 "zone_append": false, 00:10:48.256 "compare": false, 00:10:48.256 "compare_and_write": false, 00:10:48.256 "abort": true, 00:10:48.256 "seek_hole": false, 00:10:48.256 "seek_data": false, 00:10:48.256 "copy": true, 00:10:48.256 "nvme_iov_md": false 00:10:48.256 }, 00:10:48.256 "memory_domains": [ 00:10:48.256 { 00:10:48.256 "dma_device_id": "system", 00:10:48.256 "dma_device_type": 1 00:10:48.256 }, 00:10:48.256 { 00:10:48.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.256 "dma_device_type": 2 00:10:48.256 } 00:10:48.256 ], 00:10:48.256 "driver_specific": { 00:10:48.256 "passthru": { 00:10:48.256 "name": "pt2", 00:10:48.256 "base_bdev_name": "malloc2" 00:10:48.256 } 00:10:48.256 } 00:10:48.256 }' 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:48.256 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.514 13:29:27 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:48.514 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:48.514 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:48.514 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:48.514 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:48.515 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:48.515 13:29:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:48.773 [2024-07-15 13:29:28.081531] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 3c989d8a-5b38-4025-8d26-45521b352829 '!=' 3c989d8a-5b38-4025-8d26-45521b352829 ']' 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2069186 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2069186 ']' 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2069186 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2069186 00:10:48.773 
13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2069186' 00:10:48.773 killing process with pid 2069186 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2069186 00:10:48.773 [2024-07-15 13:29:28.154515] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:48.773 [2024-07-15 13:29:28.154570] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:48.773 [2024-07-15 13:29:28.154613] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:48.773 [2024-07-15 13:29:28.154625] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1491ec0 name raid_bdev1, state offline 00:10:48.773 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2069186 00:10:48.773 [2024-07-15 13:29:28.173580] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:49.031 13:29:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:49.031 00:10:49.031 real 0m10.543s 00:10:49.031 user 0m18.733s 00:10:49.031 sys 0m2.021s 00:10:49.031 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:49.031 13:29:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.031 ************************************ 00:10:49.031 END TEST raid_superblock_test 00:10:49.031 ************************************ 00:10:49.031 13:29:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:49.031 13:29:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:10:49.031 13:29:28 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:49.031 13:29:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:49.031 13:29:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:49.291 ************************************ 00:10:49.291 START TEST raid_read_error_test 00:10:49.291 ************************************ 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XVvKKVS8V6 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2070820 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2070820 /var/tmp/spdk-raid.sock 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2070820 ']' 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:10:49.291 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:49.291 13:29:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.291 [2024-07-15 13:29:28.560029] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:49.291 [2024-07-15 13:29:28.560100] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2070820 ] 00:10:49.291 [2024-07-15 13:29:28.691313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:49.551 [2024-07-15 13:29:28.795225] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:49.551 [2024-07-15 13:29:28.861254] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:49.551 [2024-07-15 13:29:28.861293] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.117 13:29:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:50.117 13:29:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:50.117 13:29:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:50.117 13:29:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:50.376 BaseBdev1_malloc 00:10:50.376 13:29:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:50.634 true 00:10:50.634 13:29:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:50.893 [2024-07-15 13:29:30.219594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:50.893 [2024-07-15 13:29:30.219643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:50.893 [2024-07-15 13:29:30.219666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18070d0 00:10:50.893 [2024-07-15 13:29:30.219680] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:50.893 [2024-07-15 13:29:30.221535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:50.893 [2024-07-15 13:29:30.221567] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:50.893 BaseBdev1 00:10:50.893 13:29:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:50.893 13:29:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:51.151 BaseBdev2_malloc 00:10:51.151 13:29:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:51.409 true 00:10:51.409 13:29:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:51.667 [2024-07-15 13:29:30.954204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:51.667 [2024-07-15 13:29:30.954260] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:51.667 [2024-07-15 13:29:30.954281] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180b910 00:10:51.667 [2024-07-15 13:29:30.954294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:51.667 [2024-07-15 13:29:30.955700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:51.667 [2024-07-15 13:29:30.955729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:51.667 BaseBdev2 00:10:51.667 13:29:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:51.925 [2024-07-15 13:29:31.198874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:51.925 [2024-07-15 13:29:31.200089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:51.925 [2024-07-15 13:29:31.200274] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180d320 00:10:51.925 [2024-07-15 13:29:31.200287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:51.925 [2024-07-15 13:29:31.200468] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x180c270 00:10:51.925 [2024-07-15 13:29:31.200609] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180d320 00:10:51.925 [2024-07-15 13:29:31.200619] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x180d320 00:10:51.925 [2024-07-15 13:29:31.200717] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:51.925 13:29:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:51.925 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.182 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.182 "name": "raid_bdev1", 00:10:52.182 "uuid": "b7dda4e4-db11-4a3a-8072-93dc833e50e2", 00:10:52.182 "strip_size_kb": 64, 00:10:52.182 "state": "online", 00:10:52.182 "raid_level": "raid0", 00:10:52.183 "superblock": true, 00:10:52.183 "num_base_bdevs": 2, 00:10:52.183 "num_base_bdevs_discovered": 2, 00:10:52.183 "num_base_bdevs_operational": 2, 00:10:52.183 "base_bdevs_list": [ 00:10:52.183 { 00:10:52.183 "name": "BaseBdev1", 00:10:52.183 "uuid": "6276ddd2-5609-546d-8897-76285b85740a", 00:10:52.183 "is_configured": true, 00:10:52.183 "data_offset": 2048, 00:10:52.183 "data_size": 63488 00:10:52.183 }, 
00:10:52.183 { 00:10:52.183 "name": "BaseBdev2", 00:10:52.183 "uuid": "be078829-2b81-5e82-8960-e788d0ee675f", 00:10:52.183 "is_configured": true, 00:10:52.183 "data_offset": 2048, 00:10:52.183 "data_size": 63488 00:10:52.183 } 00:10:52.183 ] 00:10:52.183 }' 00:10:52.183 13:29:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.183 13:29:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.747 13:29:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:52.747 13:29:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:52.747 [2024-07-15 13:29:32.169718] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18089b0 00:10:53.700 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.957 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:54.215 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.215 "name": "raid_bdev1", 00:10:54.215 "uuid": "b7dda4e4-db11-4a3a-8072-93dc833e50e2", 00:10:54.215 "strip_size_kb": 64, 00:10:54.215 "state": "online", 00:10:54.215 "raid_level": "raid0", 00:10:54.215 "superblock": true, 00:10:54.215 "num_base_bdevs": 2, 00:10:54.215 "num_base_bdevs_discovered": 2, 00:10:54.215 "num_base_bdevs_operational": 2, 00:10:54.215 "base_bdevs_list": [ 00:10:54.215 { 00:10:54.215 "name": "BaseBdev1", 00:10:54.215 "uuid": "6276ddd2-5609-546d-8897-76285b85740a", 00:10:54.215 "is_configured": true, 00:10:54.215 "data_offset": 2048, 00:10:54.215 "data_size": 63488 00:10:54.215 }, 00:10:54.215 { 00:10:54.215 "name": "BaseBdev2", 00:10:54.215 "uuid": "be078829-2b81-5e82-8960-e788d0ee675f", 00:10:54.215 "is_configured": true, 00:10:54.215 "data_offset": 2048, 00:10:54.215 "data_size": 63488 00:10:54.215 } 00:10:54.215 ] 00:10:54.215 }' 00:10:54.215 13:29:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.215 13:29:33 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:10:54.783 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:55.041 [2024-07-15 13:29:34.289446] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:55.041 [2024-07-15 13:29:34.289486] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:55.041 [2024-07-15 13:29:34.292649] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:55.041 [2024-07-15 13:29:34.292680] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:55.041 [2024-07-15 13:29:34.292707] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:55.041 [2024-07-15 13:29:34.292719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180d320 name raid_bdev1, state offline 00:10:55.041 0 00:10:55.041 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2070820 00:10:55.041 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2070820 ']' 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2070820 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2070820 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2070820' 00:10:55.042 killing process with pid 2070820 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2070820 00:10:55.042 [2024-07-15 13:29:34.357061] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:55.042 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2070820 00:10:55.042 [2024-07-15 13:29:34.367728] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XVvKKVS8V6 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:10:55.301 00:10:55.301 real 0m6.126s 00:10:55.301 user 0m9.528s 00:10:55.301 sys 0m1.082s 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:55.301 13:29:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.301 ************************************ 00:10:55.301 END TEST raid_read_error_test 00:10:55.301 ************************************ 00:10:55.301 13:29:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:55.301 13:29:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:10:55.301 13:29:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:55.301 13:29:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:55.301 13:29:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:55.301 ************************************ 00:10:55.301 START TEST raid_write_error_test 00:10:55.301 ************************************ 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TQBtjRsYjF 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2071790 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2071790 /var/tmp/spdk-raid.sock 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2071790 ']' 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:55.301 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.301 13:29:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:55.561 [2024-07-15 13:29:34.757047] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:10:55.561 [2024-07-15 13:29:34.757118] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2071790 ] 00:10:55.561 [2024-07-15 13:29:34.885968] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:55.820 [2024-07-15 13:29:34.992564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:55.820 [2024-07-15 13:29:35.059729] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:55.820 [2024-07-15 13:29:35.059774] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:56.755 13:29:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:56.755 13:29:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:56.755 13:29:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:56.755 13:29:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:57.044 BaseBdev1_malloc 00:10:57.303 13:29:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:57.303 true 00:10:57.303 13:29:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:57.871 [2024-07-15 13:29:37.190415] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:57.871 [2024-07-15 13:29:37.190466] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:57.871 [2024-07-15 13:29:37.190487] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2b0d0 00:10:57.871 [2024-07-15 13:29:37.190500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:57.871 [2024-07-15 13:29:37.192397] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:57.871 [2024-07-15 13:29:37.192429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:57.871 BaseBdev1 00:10:57.871 13:29:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:57.871 13:29:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:58.129 BaseBdev2_malloc 00:10:58.129 13:29:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:58.698 true 00:10:58.698 13:29:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:58.956 [2024-07-15 13:29:38.209671] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:58.956 [2024-07-15 13:29:38.209715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.956 [2024-07-15 13:29:38.209737] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2f910 00:10:58.956 [2024-07-15 13:29:38.209750] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.957 [2024-07-15 13:29:38.211349] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.957 [2024-07-15 13:29:38.211379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:58.957 BaseBdev2 00:10:58.957 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:59.525 [2024-07-15 13:29:38.711010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:59.525 [2024-07-15 13:29:38.712390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:59.525 [2024-07-15 13:29:38.712593] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb31320 00:10:59.525 [2024-07-15 13:29:38.712606] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:59.525 [2024-07-15 13:29:38.712803] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb30270 00:10:59.525 [2024-07-15 13:29:38.712957] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb31320 00:10:59.525 [2024-07-15 13:29:38.712968] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb31320 00:10:59.525 [2024-07-15 13:29:38.713075] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:59.525 13:29:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.525 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:59.784 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.784 "name": "raid_bdev1", 00:10:59.784 "uuid": "9682fb39-cff8-4b56-8a53-76b7cb19f1c1", 00:10:59.784 "strip_size_kb": 64, 00:10:59.784 "state": "online", 00:10:59.784 "raid_level": "raid0", 00:10:59.784 "superblock": true, 00:10:59.784 "num_base_bdevs": 2, 00:10:59.784 "num_base_bdevs_discovered": 2, 00:10:59.784 "num_base_bdevs_operational": 2, 00:10:59.784 "base_bdevs_list": [ 00:10:59.784 { 00:10:59.784 "name": "BaseBdev1", 00:10:59.784 "uuid": 
"7132a787-b504-509f-901c-6b64b95f7455", 00:10:59.784 "is_configured": true, 00:10:59.784 "data_offset": 2048, 00:10:59.784 "data_size": 63488 00:10:59.784 }, 00:10:59.784 { 00:10:59.784 "name": "BaseBdev2", 00:10:59.784 "uuid": "52562287-36f7-50b5-a19d-1d37c8306754", 00:10:59.784 "is_configured": true, 00:10:59.784 "data_offset": 2048, 00:10:59.784 "data_size": 63488 00:10:59.784 } 00:10:59.784 ] 00:10:59.784 }' 00:10:59.784 13:29:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.784 13:29:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.352 13:29:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:00.352 13:29:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:00.352 [2024-07-15 13:29:39.693870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2c9b0 00:11:01.289 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:01.548 13:29:40 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.548 13:29:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:01.807 13:29:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.807 "name": "raid_bdev1", 00:11:01.807 "uuid": "9682fb39-cff8-4b56-8a53-76b7cb19f1c1", 00:11:01.807 "strip_size_kb": 64, 00:11:01.807 "state": "online", 00:11:01.807 "raid_level": "raid0", 00:11:01.807 "superblock": true, 00:11:01.807 "num_base_bdevs": 2, 00:11:01.807 "num_base_bdevs_discovered": 2, 00:11:01.807 "num_base_bdevs_operational": 2, 00:11:01.807 "base_bdevs_list": [ 00:11:01.807 { 00:11:01.807 "name": "BaseBdev1", 00:11:01.807 "uuid": "7132a787-b504-509f-901c-6b64b95f7455", 00:11:01.807 "is_configured": true, 00:11:01.807 "data_offset": 2048, 00:11:01.807 "data_size": 63488 00:11:01.807 }, 00:11:01.807 { 00:11:01.807 "name": "BaseBdev2", 00:11:01.807 "uuid": "52562287-36f7-50b5-a19d-1d37c8306754", 00:11:01.807 "is_configured": true, 00:11:01.807 "data_offset": 2048, 00:11:01.807 "data_size": 63488 00:11:01.807 
} 00:11:01.807 ] 00:11:01.807 }' 00:11:01.807 13:29:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.807 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.374 13:29:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:02.633 [2024-07-15 13:29:41.918218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:02.633 [2024-07-15 13:29:41.918255] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:02.633 [2024-07-15 13:29:41.921415] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:02.633 [2024-07-15 13:29:41.921448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:02.633 [2024-07-15 13:29:41.921484] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:02.633 [2024-07-15 13:29:41.921496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb31320 name raid_bdev1, state offline 00:11:02.633 0 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2071790 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2071790 ']' 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2071790 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2071790 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2071790' 00:11:02.633 killing process with pid 2071790 00:11:02.633 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2071790 00:11:02.633 [2024-07-15 13:29:41.988305] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:02.634 13:29:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2071790 00:11:02.634 [2024-07-15 13:29:41.998623] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TQBtjRsYjF 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:02.893 00:11:02.893 real 0m7.545s 00:11:02.893 user 0m12.152s 00:11:02.893 sys 0m1.302s 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:02.893 13:29:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.893 ************************************ 00:11:02.893 END TEST raid_write_error_test 00:11:02.893 
************************************ 00:11:02.893 13:29:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:02.893 13:29:42 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:02.893 13:29:42 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:02.893 13:29:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:02.893 13:29:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:02.893 13:29:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:02.893 ************************************ 00:11:02.893 START TEST raid_state_function_test 00:11:02.893 ************************************ 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:02.893 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2072776 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2072776' 00:11:03.153 Process raid pid: 2072776 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2072776 /var/tmp/spdk-raid.sock 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2072776 ']' 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:03.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.153 13:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:03.153 [2024-07-15 13:29:42.381203] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:11:03.153 [2024-07-15 13:29:42.381274] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:03.153 [2024-07-15 13:29:42.511600] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:03.412 [2024-07-15 13:29:42.618183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:03.412 [2024-07-15 13:29:42.688312] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:03.412 [2024-07-15 13:29:42.688349] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:04.348 13:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:04.348 13:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:04.348 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:04.606 [2024-07-15 13:29:43.808218] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:04.606 [2024-07-15 13:29:43.808266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:04.606 [2024-07-15 13:29:43.808277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:04.606 [2024-07-15 13:29:43.808289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.606 13:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.865 13:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.865 "name": "Existed_Raid", 00:11:04.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.865 "strip_size_kb": 64, 00:11:04.865 "state": "configuring", 00:11:04.865 "raid_level": "concat", 00:11:04.865 "superblock": false, 00:11:04.865 "num_base_bdevs": 2, 00:11:04.865 "num_base_bdevs_discovered": 0, 00:11:04.865 "num_base_bdevs_operational": 2, 00:11:04.865 "base_bdevs_list": [ 00:11:04.865 { 00:11:04.865 "name": "BaseBdev1", 00:11:04.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.865 "is_configured": false, 00:11:04.865 "data_offset": 0, 00:11:04.865 "data_size": 0 00:11:04.865 }, 00:11:04.865 { 00:11:04.865 "name": "BaseBdev2", 00:11:04.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.865 "is_configured": false, 00:11:04.865 "data_offset": 0, 00:11:04.865 "data_size": 0 00:11:04.865 } 00:11:04.865 ] 00:11:04.865 }' 00:11:04.865 13:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.865 13:29:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.432 13:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:05.432 [2024-07-15 13:29:44.834781] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:05.432 [2024-07-15 13:29:44.834813] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162ea80 name Existed_Raid, state configuring 00:11:05.432 13:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:05.690 [2024-07-15 13:29:45.011270] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:05.690 [2024-07-15 13:29:45.011297] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:05.690 [2024-07-15 13:29:45.011307] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:05.690 [2024-07-15 13:29:45.011318] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:05.690 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:05.949 [2024-07-15 13:29:45.201712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:05.949 BaseBdev1 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:05.949 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:06.208 13:29:45 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:06.208 [ 00:11:06.208 { 00:11:06.208 "name": "BaseBdev1", 00:11:06.208 "aliases": [ 00:11:06.208 "50494dd0-01fa-4074-b08b-0fe105f1ae04" 00:11:06.208 ], 00:11:06.208 "product_name": "Malloc disk", 00:11:06.208 "block_size": 512, 00:11:06.208 "num_blocks": 65536, 00:11:06.208 "uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:06.208 "assigned_rate_limits": { 00:11:06.208 "rw_ios_per_sec": 0, 00:11:06.208 "rw_mbytes_per_sec": 0, 00:11:06.208 "r_mbytes_per_sec": 0, 00:11:06.208 "w_mbytes_per_sec": 0 00:11:06.208 }, 00:11:06.208 "claimed": true, 00:11:06.208 "claim_type": "exclusive_write", 00:11:06.208 "zoned": false, 00:11:06.208 "supported_io_types": { 00:11:06.208 "read": true, 00:11:06.208 "write": true, 00:11:06.208 "unmap": true, 00:11:06.208 "flush": true, 00:11:06.208 "reset": true, 00:11:06.208 "nvme_admin": false, 00:11:06.208 "nvme_io": false, 00:11:06.208 "nvme_io_md": false, 00:11:06.208 "write_zeroes": true, 00:11:06.208 "zcopy": true, 00:11:06.208 "get_zone_info": false, 00:11:06.208 "zone_management": false, 00:11:06.208 "zone_append": false, 00:11:06.208 "compare": false, 00:11:06.208 "compare_and_write": false, 00:11:06.208 "abort": true, 00:11:06.208 "seek_hole": false, 00:11:06.209 "seek_data": false, 00:11:06.209 "copy": true, 00:11:06.209 "nvme_iov_md": false 00:11:06.209 }, 00:11:06.209 "memory_domains": [ 00:11:06.209 { 00:11:06.209 "dma_device_id": "system", 00:11:06.209 "dma_device_type": 1 00:11:06.209 }, 00:11:06.209 { 00:11:06.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.209 "dma_device_type": 2 00:11:06.209 } 00:11:06.209 ], 00:11:06.209 "driver_specific": {} 00:11:06.209 } 00:11:06.209 ] 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.468 "name": "Existed_Raid", 00:11:06.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.468 "strip_size_kb": 64, 00:11:06.468 "state": "configuring", 00:11:06.468 "raid_level": "concat", 00:11:06.468 "superblock": false, 00:11:06.468 "num_base_bdevs": 2, 00:11:06.468 "num_base_bdevs_discovered": 1, 00:11:06.468 "num_base_bdevs_operational": 2, 00:11:06.468 "base_bdevs_list": [ 00:11:06.468 { 00:11:06.468 "name": "BaseBdev1", 00:11:06.468 
"uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:06.468 "is_configured": true, 00:11:06.468 "data_offset": 0, 00:11:06.468 "data_size": 65536 00:11:06.468 }, 00:11:06.468 { 00:11:06.468 "name": "BaseBdev2", 00:11:06.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:06.468 "is_configured": false, 00:11:06.468 "data_offset": 0, 00:11:06.468 "data_size": 0 00:11:06.468 } 00:11:06.468 ] 00:11:06.468 }' 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.468 13:29:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.404 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:07.404 [2024-07-15 13:29:46.621479] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:07.404 [2024-07-15 13:29:46.621519] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162e350 name Existed_Raid, state configuring 00:11:07.404 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:07.663 [2024-07-15 13:29:46.870161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:07.663 [2024-07-15 13:29:46.871653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:07.663 [2024-07-15 13:29:46.871685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:07.663 13:29:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.663 13:29:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.663 13:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.663 "name": "Existed_Raid", 00:11:07.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.663 "strip_size_kb": 64, 00:11:07.663 "state": "configuring", 00:11:07.663 "raid_level": "concat", 00:11:07.663 "superblock": false, 00:11:07.663 "num_base_bdevs": 2, 00:11:07.663 "num_base_bdevs_discovered": 1, 00:11:07.663 "num_base_bdevs_operational": 2, 00:11:07.663 "base_bdevs_list": [ 00:11:07.663 { 
00:11:07.663 "name": "BaseBdev1", 00:11:07.663 "uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:07.663 "is_configured": true, 00:11:07.663 "data_offset": 0, 00:11:07.663 "data_size": 65536 00:11:07.663 }, 00:11:07.663 { 00:11:07.663 "name": "BaseBdev2", 00:11:07.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.663 "is_configured": false, 00:11:07.663 "data_offset": 0, 00:11:07.663 "data_size": 0 00:11:07.663 } 00:11:07.663 ] 00:11:07.663 }' 00:11:07.663 13:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.663 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.231 13:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:08.489 [2024-07-15 13:29:47.856214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:08.489 [2024-07-15 13:29:47.856254] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x162f000 00:11:08.489 [2024-07-15 13:29:47.856263] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:08.489 [2024-07-15 13:29:47.856454] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15490c0 00:11:08.489 [2024-07-15 13:29:47.856576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x162f000 00:11:08.489 [2024-07-15 13:29:47.856586] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x162f000 00:11:08.489 [2024-07-15 13:29:47.856752] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:08.489 BaseBdev2 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev2 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:08.489 13:29:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:08.748 13:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:09.006 [ 00:11:09.006 { 00:11:09.006 "name": "BaseBdev2", 00:11:09.006 "aliases": [ 00:11:09.006 "541001e6-c359-4564-87dc-d0ab20107328" 00:11:09.006 ], 00:11:09.006 "product_name": "Malloc disk", 00:11:09.006 "block_size": 512, 00:11:09.006 "num_blocks": 65536, 00:11:09.006 "uuid": "541001e6-c359-4564-87dc-d0ab20107328", 00:11:09.006 "assigned_rate_limits": { 00:11:09.006 "rw_ios_per_sec": 0, 00:11:09.006 "rw_mbytes_per_sec": 0, 00:11:09.006 "r_mbytes_per_sec": 0, 00:11:09.006 "w_mbytes_per_sec": 0 00:11:09.006 }, 00:11:09.006 "claimed": true, 00:11:09.006 "claim_type": "exclusive_write", 00:11:09.006 "zoned": false, 00:11:09.006 "supported_io_types": { 00:11:09.006 "read": true, 00:11:09.006 "write": true, 00:11:09.006 "unmap": true, 00:11:09.006 "flush": true, 00:11:09.006 "reset": true, 00:11:09.006 "nvme_admin": false, 00:11:09.006 "nvme_io": false, 00:11:09.006 "nvme_io_md": false, 00:11:09.006 "write_zeroes": true, 00:11:09.006 "zcopy": true, 00:11:09.006 "get_zone_info": false, 00:11:09.006 "zone_management": false, 00:11:09.006 "zone_append": false, 00:11:09.006 "compare": false, 
00:11:09.006 "compare_and_write": false, 00:11:09.006 "abort": true, 00:11:09.006 "seek_hole": false, 00:11:09.006 "seek_data": false, 00:11:09.006 "copy": true, 00:11:09.006 "nvme_iov_md": false 00:11:09.006 }, 00:11:09.006 "memory_domains": [ 00:11:09.006 { 00:11:09.006 "dma_device_id": "system", 00:11:09.006 "dma_device_type": 1 00:11:09.006 }, 00:11:09.006 { 00:11:09.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.006 "dma_device_type": 2 00:11:09.006 } 00:11:09.006 ], 00:11:09.006 "driver_specific": {} 00:11:09.006 } 00:11:09.006 ] 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 
-- # local tmp 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.006 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.265 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.265 "name": "Existed_Raid", 00:11:09.265 "uuid": "19d66d11-4b3d-4f10-80cf-141c35a3fa2f", 00:11:09.265 "strip_size_kb": 64, 00:11:09.265 "state": "online", 00:11:09.265 "raid_level": "concat", 00:11:09.265 "superblock": false, 00:11:09.265 "num_base_bdevs": 2, 00:11:09.265 "num_base_bdevs_discovered": 2, 00:11:09.265 "num_base_bdevs_operational": 2, 00:11:09.265 "base_bdevs_list": [ 00:11:09.265 { 00:11:09.265 "name": "BaseBdev1", 00:11:09.265 "uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:09.265 "is_configured": true, 00:11:09.265 "data_offset": 0, 00:11:09.265 "data_size": 65536 00:11:09.265 }, 00:11:09.265 { 00:11:09.265 "name": "BaseBdev2", 00:11:09.265 "uuid": "541001e6-c359-4564-87dc-d0ab20107328", 00:11:09.265 "is_configured": true, 00:11:09.265 "data_offset": 0, 00:11:09.265 "data_size": 65536 00:11:09.265 } 00:11:09.265 ] 00:11:09.265 }' 00:11:09.265 13:29:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.265 13:29:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:09.833 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:10.091 [2024-07-15 13:29:49.432673] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:10.091 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:10.091 "name": "Existed_Raid", 00:11:10.091 "aliases": [ 00:11:10.091 "19d66d11-4b3d-4f10-80cf-141c35a3fa2f" 00:11:10.091 ], 00:11:10.091 "product_name": "Raid Volume", 00:11:10.091 "block_size": 512, 00:11:10.091 "num_blocks": 131072, 00:11:10.091 "uuid": "19d66d11-4b3d-4f10-80cf-141c35a3fa2f", 00:11:10.091 "assigned_rate_limits": { 00:11:10.091 "rw_ios_per_sec": 0, 00:11:10.091 "rw_mbytes_per_sec": 0, 00:11:10.091 "r_mbytes_per_sec": 0, 00:11:10.091 "w_mbytes_per_sec": 0 00:11:10.091 }, 00:11:10.091 "claimed": false, 00:11:10.091 "zoned": false, 00:11:10.091 "supported_io_types": { 00:11:10.091 "read": true, 00:11:10.091 "write": true, 00:11:10.091 "unmap": true, 00:11:10.091 "flush": true, 00:11:10.091 "reset": true, 00:11:10.091 "nvme_admin": false, 00:11:10.091 "nvme_io": false, 00:11:10.091 "nvme_io_md": false, 00:11:10.091 "write_zeroes": true, 00:11:10.091 "zcopy": false, 00:11:10.091 "get_zone_info": false, 00:11:10.091 "zone_management": false, 00:11:10.091 "zone_append": false, 00:11:10.091 "compare": false, 00:11:10.091 "compare_and_write": false, 00:11:10.091 "abort": false, 00:11:10.091 "seek_hole": false, 00:11:10.091 "seek_data": false, 00:11:10.091 "copy": false, 00:11:10.091 "nvme_iov_md": false 00:11:10.091 }, 
00:11:10.092 "memory_domains": [ 00:11:10.092 { 00:11:10.092 "dma_device_id": "system", 00:11:10.092 "dma_device_type": 1 00:11:10.092 }, 00:11:10.092 { 00:11:10.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:10.092 "dma_device_type": 2 00:11:10.092 }, 00:11:10.092 { 00:11:10.092 "dma_device_id": "system", 00:11:10.092 "dma_device_type": 1 00:11:10.092 }, 00:11:10.092 { 00:11:10.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:10.092 "dma_device_type": 2 00:11:10.092 } 00:11:10.092 ], 00:11:10.092 "driver_specific": { 00:11:10.092 "raid": { 00:11:10.092 "uuid": "19d66d11-4b3d-4f10-80cf-141c35a3fa2f", 00:11:10.092 "strip_size_kb": 64, 00:11:10.092 "state": "online", 00:11:10.092 "raid_level": "concat", 00:11:10.092 "superblock": false, 00:11:10.092 "num_base_bdevs": 2, 00:11:10.092 "num_base_bdevs_discovered": 2, 00:11:10.092 "num_base_bdevs_operational": 2, 00:11:10.092 "base_bdevs_list": [ 00:11:10.092 { 00:11:10.092 "name": "BaseBdev1", 00:11:10.092 "uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:10.092 "is_configured": true, 00:11:10.092 "data_offset": 0, 00:11:10.092 "data_size": 65536 00:11:10.092 }, 00:11:10.092 { 00:11:10.092 "name": "BaseBdev2", 00:11:10.092 "uuid": "541001e6-c359-4564-87dc-d0ab20107328", 00:11:10.092 "is_configured": true, 00:11:10.092 "data_offset": 0, 00:11:10.092 "data_size": 65536 00:11:10.092 } 00:11:10.092 ] 00:11:10.092 } 00:11:10.092 } 00:11:10.092 }' 00:11:10.092 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:10.092 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:10.092 BaseBdev2' 00:11:10.092 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:10.092 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:10.092 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:10.350 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:10.350 "name": "BaseBdev1", 00:11:10.350 "aliases": [ 00:11:10.350 "50494dd0-01fa-4074-b08b-0fe105f1ae04" 00:11:10.350 ], 00:11:10.350 "product_name": "Malloc disk", 00:11:10.350 "block_size": 512, 00:11:10.350 "num_blocks": 65536, 00:11:10.350 "uuid": "50494dd0-01fa-4074-b08b-0fe105f1ae04", 00:11:10.350 "assigned_rate_limits": { 00:11:10.350 "rw_ios_per_sec": 0, 00:11:10.350 "rw_mbytes_per_sec": 0, 00:11:10.350 "r_mbytes_per_sec": 0, 00:11:10.350 "w_mbytes_per_sec": 0 00:11:10.350 }, 00:11:10.350 "claimed": true, 00:11:10.350 "claim_type": "exclusive_write", 00:11:10.350 "zoned": false, 00:11:10.350 "supported_io_types": { 00:11:10.350 "read": true, 00:11:10.350 "write": true, 00:11:10.350 "unmap": true, 00:11:10.350 "flush": true, 00:11:10.350 "reset": true, 00:11:10.350 "nvme_admin": false, 00:11:10.350 "nvme_io": false, 00:11:10.350 "nvme_io_md": false, 00:11:10.350 "write_zeroes": true, 00:11:10.350 "zcopy": true, 00:11:10.350 "get_zone_info": false, 00:11:10.350 "zone_management": false, 00:11:10.350 "zone_append": false, 00:11:10.350 "compare": false, 00:11:10.350 "compare_and_write": false, 00:11:10.350 "abort": true, 00:11:10.350 "seek_hole": false, 00:11:10.350 "seek_data": false, 00:11:10.350 "copy": true, 00:11:10.350 "nvme_iov_md": false 00:11:10.350 }, 00:11:10.350 "memory_domains": [ 00:11:10.350 { 00:11:10.350 "dma_device_id": "system", 00:11:10.350 "dma_device_type": 1 00:11:10.350 }, 00:11:10.350 { 00:11:10.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:10.350 "dma_device_type": 2 00:11:10.350 } 00:11:10.350 ], 00:11:10.350 "driver_specific": {} 00:11:10.350 }' 00:11:10.350 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:10.608 13:29:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:10.608 13:29:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:10.866 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:10.866 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:10.866 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:10.866 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:10.866 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:11.125 "name": "BaseBdev2", 00:11:11.125 "aliases": [ 00:11:11.125 "541001e6-c359-4564-87dc-d0ab20107328" 00:11:11.125 ], 00:11:11.125 "product_name": "Malloc disk", 00:11:11.125 "block_size": 512, 00:11:11.125 "num_blocks": 65536, 00:11:11.125 "uuid": "541001e6-c359-4564-87dc-d0ab20107328", 00:11:11.125 "assigned_rate_limits": { 00:11:11.125 
"rw_ios_per_sec": 0, 00:11:11.125 "rw_mbytes_per_sec": 0, 00:11:11.125 "r_mbytes_per_sec": 0, 00:11:11.125 "w_mbytes_per_sec": 0 00:11:11.125 }, 00:11:11.125 "claimed": true, 00:11:11.125 "claim_type": "exclusive_write", 00:11:11.125 "zoned": false, 00:11:11.125 "supported_io_types": { 00:11:11.125 "read": true, 00:11:11.125 "write": true, 00:11:11.125 "unmap": true, 00:11:11.125 "flush": true, 00:11:11.125 "reset": true, 00:11:11.125 "nvme_admin": false, 00:11:11.125 "nvme_io": false, 00:11:11.125 "nvme_io_md": false, 00:11:11.125 "write_zeroes": true, 00:11:11.125 "zcopy": true, 00:11:11.125 "get_zone_info": false, 00:11:11.125 "zone_management": false, 00:11:11.125 "zone_append": false, 00:11:11.125 "compare": false, 00:11:11.125 "compare_and_write": false, 00:11:11.125 "abort": true, 00:11:11.125 "seek_hole": false, 00:11:11.125 "seek_data": false, 00:11:11.125 "copy": true, 00:11:11.125 "nvme_iov_md": false 00:11:11.125 }, 00:11:11.125 "memory_domains": [ 00:11:11.125 { 00:11:11.125 "dma_device_id": "system", 00:11:11.125 "dma_device_type": 1 00:11:11.125 }, 00:11:11.125 { 00:11:11.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.125 "dma_device_type": 2 00:11:11.125 } 00:11:11.125 ], 00:11:11.125 "driver_specific": {} 00:11:11.125 }' 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:11.125 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:11.125 
13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:11.383 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:11.384 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:11.384 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:11.384 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:11.384 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:11.643 [2024-07-15 13:29:50.892355] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:11.643 [2024-07-15 13:29:50.892385] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:11.643 [2024-07-15 13:29:50.892426] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:11.643 13:29:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.643 13:29:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:11.902 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.902 "name": "Existed_Raid", 00:11:11.902 "uuid": "19d66d11-4b3d-4f10-80cf-141c35a3fa2f", 00:11:11.902 "strip_size_kb": 64, 00:11:11.902 "state": "offline", 00:11:11.902 "raid_level": "concat", 00:11:11.902 "superblock": false, 00:11:11.902 "num_base_bdevs": 2, 00:11:11.902 "num_base_bdevs_discovered": 1, 00:11:11.902 "num_base_bdevs_operational": 1, 00:11:11.902 "base_bdevs_list": [ 00:11:11.902 { 00:11:11.902 "name": null, 00:11:11.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.902 "is_configured": false, 00:11:11.902 "data_offset": 0, 00:11:11.902 "data_size": 65536 00:11:11.902 }, 00:11:11.902 { 00:11:11.902 "name": "BaseBdev2", 00:11:11.902 "uuid": "541001e6-c359-4564-87dc-d0ab20107328", 00:11:11.902 "is_configured": true, 00:11:11.902 "data_offset": 0, 00:11:11.902 
"data_size": 65536 00:11:11.902 } 00:11:11.902 ] 00:11:11.902 }' 00:11:11.902 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.902 13:29:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.469 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:12.469 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:12.469 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:12.469 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.727 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:12.727 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:12.727 13:29:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:12.986 [2024-07-15 13:29:52.220956] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:12.986 [2024-07-15 13:29:52.221013] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162f000 name Existed_Raid, state offline 00:11:12.986 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:12.986 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:12.986 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.986 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r 
'.[0]["name"] | select(.)' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2072776 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2072776 ']' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2072776 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2072776 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2072776' 00:11:13.244 killing process with pid 2072776 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2072776 00:11:13.244 [2024-07-15 13:29:52.541735] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:13.244 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2072776 00:11:13.244 [2024-07-15 13:29:52.542730] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:13.505 00:11:13.505 real 0m10.455s 
00:11:13.505 user 0m18.551s 00:11:13.505 sys 0m1.949s 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.505 ************************************ 00:11:13.505 END TEST raid_state_function_test 00:11:13.505 ************************************ 00:11:13.505 13:29:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:13.505 13:29:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:13.505 13:29:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:13.505 13:29:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:13.505 13:29:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:13.505 ************************************ 00:11:13.505 START TEST raid_state_function_test_sb 00:11:13.505 ************************************ 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2074405 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2074405' 00:11:13.505 Process raid 
pid: 2074405 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2074405 /var/tmp/spdk-raid.sock 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2074405 ']' 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:13.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:13.505 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:13.505 [2024-07-15 13:29:52.912975] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:11:13.505 [2024-07-15 13:29:52.913043] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:13.804 [2024-07-15 13:29:53.042807] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:13.804 [2024-07-15 13:29:53.145125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:13.804 [2024-07-15 13:29:53.211574] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:13.804 [2024-07-15 13:29:53.211608] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.740 13:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:14.740 13:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:14.740 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:14.740 [2024-07-15 13:29:54.071468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:14.740 [2024-07-15 13:29:54.071513] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:14.740 [2024-07-15 13:29:54.071524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:14.740 [2024-07-15 13:29:54.071536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.740 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:14.999 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:14.999 "name": "Existed_Raid", 00:11:14.999 "uuid": "7a8b589a-1199-439e-bea0-5a0455129a33", 00:11:14.999 "strip_size_kb": 64, 00:11:14.999 "state": "configuring", 00:11:14.999 "raid_level": "concat", 00:11:14.999 "superblock": true, 00:11:14.999 "num_base_bdevs": 2, 00:11:14.999 "num_base_bdevs_discovered": 0, 00:11:14.999 "num_base_bdevs_operational": 2, 00:11:14.999 "base_bdevs_list": [ 00:11:14.999 { 00:11:14.999 "name": "BaseBdev1", 00:11:15.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.000 "is_configured": false, 00:11:15.000 "data_offset": 0, 00:11:15.000 "data_size": 0 00:11:15.000 }, 00:11:15.000 { 
00:11:15.000 "name": "BaseBdev2", 00:11:15.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.000 "is_configured": false, 00:11:15.000 "data_offset": 0, 00:11:15.000 "data_size": 0 00:11:15.000 } 00:11:15.000 ] 00:11:15.000 }' 00:11:15.000 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.000 13:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.567 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:15.826 [2024-07-15 13:29:55.154180] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:15.826 [2024-07-15 13:29:55.154208] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e4ca80 name Existed_Raid, state configuring 00:11:15.826 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:16.085 [2024-07-15 13:29:55.406870] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:16.085 [2024-07-15 13:29:55.406900] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:16.085 [2024-07-15 13:29:55.406910] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:16.085 [2024-07-15 13:29:55.406922] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:16.085 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:16.344 [2024-07-15 13:29:55.658578] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:16.344 BaseBdev1 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.344 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:16.602 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:16.861 [ 00:11:16.861 { 00:11:16.861 "name": "BaseBdev1", 00:11:16.861 "aliases": [ 00:11:16.861 "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9" 00:11:16.861 ], 00:11:16.861 "product_name": "Malloc disk", 00:11:16.861 "block_size": 512, 00:11:16.861 "num_blocks": 65536, 00:11:16.861 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:16.861 "assigned_rate_limits": { 00:11:16.861 "rw_ios_per_sec": 0, 00:11:16.861 "rw_mbytes_per_sec": 0, 00:11:16.861 "r_mbytes_per_sec": 0, 00:11:16.861 "w_mbytes_per_sec": 0 00:11:16.861 }, 00:11:16.861 "claimed": true, 00:11:16.861 "claim_type": "exclusive_write", 00:11:16.861 "zoned": false, 00:11:16.861 "supported_io_types": { 00:11:16.861 "read": true, 00:11:16.861 "write": true, 00:11:16.861 "unmap": true, 00:11:16.861 "flush": 
true, 00:11:16.861 "reset": true, 00:11:16.861 "nvme_admin": false, 00:11:16.861 "nvme_io": false, 00:11:16.861 "nvme_io_md": false, 00:11:16.861 "write_zeroes": true, 00:11:16.861 "zcopy": true, 00:11:16.861 "get_zone_info": false, 00:11:16.861 "zone_management": false, 00:11:16.861 "zone_append": false, 00:11:16.861 "compare": false, 00:11:16.861 "compare_and_write": false, 00:11:16.861 "abort": true, 00:11:16.861 "seek_hole": false, 00:11:16.861 "seek_data": false, 00:11:16.861 "copy": true, 00:11:16.861 "nvme_iov_md": false 00:11:16.861 }, 00:11:16.861 "memory_domains": [ 00:11:16.861 { 00:11:16.861 "dma_device_id": "system", 00:11:16.861 "dma_device_type": 1 00:11:16.861 }, 00:11:16.861 { 00:11:16.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.861 "dma_device_type": 2 00:11:16.861 } 00:11:16.861 ], 00:11:16.861 "driver_specific": {} 00:11:16.861 } 00:11:16.861 ] 00:11:16.861 13:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:16.861 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:16.861 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.862 13:29:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.862 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.121 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.121 "name": "Existed_Raid", 00:11:17.121 "uuid": "3a678dc3-3288-438d-a065-2f3f6e8b4fb0", 00:11:17.121 "strip_size_kb": 64, 00:11:17.121 "state": "configuring", 00:11:17.121 "raid_level": "concat", 00:11:17.121 "superblock": true, 00:11:17.121 "num_base_bdevs": 2, 00:11:17.121 "num_base_bdevs_discovered": 1, 00:11:17.121 "num_base_bdevs_operational": 2, 00:11:17.121 "base_bdevs_list": [ 00:11:17.121 { 00:11:17.121 "name": "BaseBdev1", 00:11:17.121 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:17.121 "is_configured": true, 00:11:17.121 "data_offset": 2048, 00:11:17.121 "data_size": 63488 00:11:17.121 }, 00:11:17.121 { 00:11:17.121 "name": "BaseBdev2", 00:11:17.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.121 "is_configured": false, 00:11:17.121 "data_offset": 0, 00:11:17.121 "data_size": 0 00:11:17.121 } 00:11:17.121 ] 00:11:17.121 }' 00:11:17.121 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.121 13:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:17.689 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:17.948 [2024-07-15 13:29:57.226749] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:17.948 [2024-07-15 13:29:57.226786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e4c350 name Existed_Raid, state configuring 00:11:17.948 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:18.208 [2024-07-15 13:29:57.471428] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.208 [2024-07-15 13:29:57.472909] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:18.208 [2024-07-15 13:29:57.472949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.208 13:29:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.208 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.467 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.467 "name": "Existed_Raid", 00:11:18.467 "uuid": "abebc25f-e341-4d73-b75a-52356199b983", 00:11:18.467 "strip_size_kb": 64, 00:11:18.467 "state": "configuring", 00:11:18.467 "raid_level": "concat", 00:11:18.467 "superblock": true, 00:11:18.467 "num_base_bdevs": 2, 00:11:18.467 "num_base_bdevs_discovered": 1, 00:11:18.467 "num_base_bdevs_operational": 2, 00:11:18.467 "base_bdevs_list": [ 00:11:18.467 { 00:11:18.467 "name": "BaseBdev1", 00:11:18.467 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:18.467 "is_configured": true, 00:11:18.467 "data_offset": 2048, 00:11:18.467 "data_size": 63488 00:11:18.467 }, 00:11:18.467 { 00:11:18.467 "name": "BaseBdev2", 00:11:18.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.467 "is_configured": false, 00:11:18.467 "data_offset": 0, 00:11:18.467 "data_size": 0 00:11:18.467 } 00:11:18.467 ] 00:11:18.467 }' 00:11:18.467 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.467 13:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.035 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:19.295 [2024-07-15 13:29:58.577801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:19.295 [2024-07-15 13:29:58.577956] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e4d000 00:11:19.295 [2024-07-15 13:29:58.577971] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:19.295 [2024-07-15 13:29:58.578144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d670c0 00:11:19.295 [2024-07-15 13:29:58.578256] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e4d000 00:11:19.295 [2024-07-15 13:29:58.578267] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e4d000 00:11:19.295 [2024-07-15 13:29:58.578355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:19.295 BaseBdev2 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:19.295 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:19.554 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:19.813 [ 00:11:19.813 { 00:11:19.813 "name": "BaseBdev2", 00:11:19.813 "aliases": [ 00:11:19.813 "deca9820-b080-431c-a3a5-3af2b0713fa3" 00:11:19.813 ], 00:11:19.813 "product_name": "Malloc disk", 00:11:19.813 "block_size": 512, 00:11:19.813 "num_blocks": 65536, 00:11:19.813 "uuid": "deca9820-b080-431c-a3a5-3af2b0713fa3", 00:11:19.813 "assigned_rate_limits": { 00:11:19.813 "rw_ios_per_sec": 0, 00:11:19.813 "rw_mbytes_per_sec": 0, 00:11:19.813 "r_mbytes_per_sec": 0, 00:11:19.813 "w_mbytes_per_sec": 0 00:11:19.813 }, 00:11:19.813 "claimed": true, 00:11:19.813 "claim_type": "exclusive_write", 00:11:19.813 "zoned": false, 00:11:19.813 "supported_io_types": { 00:11:19.813 "read": true, 00:11:19.813 "write": true, 00:11:19.813 "unmap": true, 00:11:19.813 "flush": true, 00:11:19.813 "reset": true, 00:11:19.813 "nvme_admin": false, 00:11:19.813 "nvme_io": false, 00:11:19.813 "nvme_io_md": false, 00:11:19.813 "write_zeroes": true, 00:11:19.813 "zcopy": true, 00:11:19.813 "get_zone_info": false, 00:11:19.813 "zone_management": false, 00:11:19.813 "zone_append": false, 00:11:19.813 "compare": false, 00:11:19.813 "compare_and_write": false, 00:11:19.813 "abort": true, 00:11:19.813 "seek_hole": false, 00:11:19.813 "seek_data": false, 00:11:19.813 "copy": true, 00:11:19.813 "nvme_iov_md": false 00:11:19.813 }, 00:11:19.813 "memory_domains": [ 00:11:19.813 { 00:11:19.813 "dma_device_id": "system", 00:11:19.813 "dma_device_type": 1 00:11:19.813 }, 00:11:19.813 { 00:11:19.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.813 "dma_device_type": 2 00:11:19.813 } 00:11:19.813 ], 00:11:19.813 "driver_specific": {} 00:11:19.813 } 00:11:19.813 ] 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.813 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.072 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.072 "name": "Existed_Raid", 00:11:20.072 "uuid": "abebc25f-e341-4d73-b75a-52356199b983", 00:11:20.072 "strip_size_kb": 64, 00:11:20.072 "state": "online", 00:11:20.072 "raid_level": "concat", 00:11:20.072 "superblock": true, 00:11:20.072 
"num_base_bdevs": 2, 00:11:20.072 "num_base_bdevs_discovered": 2, 00:11:20.072 "num_base_bdevs_operational": 2, 00:11:20.072 "base_bdevs_list": [ 00:11:20.072 { 00:11:20.072 "name": "BaseBdev1", 00:11:20.072 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:20.072 "is_configured": true, 00:11:20.072 "data_offset": 2048, 00:11:20.072 "data_size": 63488 00:11:20.072 }, 00:11:20.072 { 00:11:20.072 "name": "BaseBdev2", 00:11:20.072 "uuid": "deca9820-b080-431c-a3a5-3af2b0713fa3", 00:11:20.072 "is_configured": true, 00:11:20.072 "data_offset": 2048, 00:11:20.072 "data_size": 63488 00:11:20.072 } 00:11:20.072 ] 00:11:20.072 }' 00:11:20.072 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.072 13:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:20.640 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:20.640 [2024-07-15 13:30:00.033948] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:20.640 13:30:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:20.640 "name": "Existed_Raid", 00:11:20.640 "aliases": [ 00:11:20.640 "abebc25f-e341-4d73-b75a-52356199b983" 00:11:20.640 ], 00:11:20.640 "product_name": "Raid Volume", 00:11:20.640 "block_size": 512, 00:11:20.640 "num_blocks": 126976, 00:11:20.640 "uuid": "abebc25f-e341-4d73-b75a-52356199b983", 00:11:20.640 "assigned_rate_limits": { 00:11:20.640 "rw_ios_per_sec": 0, 00:11:20.640 "rw_mbytes_per_sec": 0, 00:11:20.640 "r_mbytes_per_sec": 0, 00:11:20.640 "w_mbytes_per_sec": 0 00:11:20.640 }, 00:11:20.640 "claimed": false, 00:11:20.640 "zoned": false, 00:11:20.640 "supported_io_types": { 00:11:20.640 "read": true, 00:11:20.640 "write": true, 00:11:20.640 "unmap": true, 00:11:20.640 "flush": true, 00:11:20.640 "reset": true, 00:11:20.640 "nvme_admin": false, 00:11:20.640 "nvme_io": false, 00:11:20.640 "nvme_io_md": false, 00:11:20.640 "write_zeroes": true, 00:11:20.640 "zcopy": false, 00:11:20.640 "get_zone_info": false, 00:11:20.640 "zone_management": false, 00:11:20.640 "zone_append": false, 00:11:20.640 "compare": false, 00:11:20.640 "compare_and_write": false, 00:11:20.640 "abort": false, 00:11:20.640 "seek_hole": false, 00:11:20.640 "seek_data": false, 00:11:20.640 "copy": false, 00:11:20.640 "nvme_iov_md": false 00:11:20.640 }, 00:11:20.640 "memory_domains": [ 00:11:20.640 { 00:11:20.640 "dma_device_id": "system", 00:11:20.640 "dma_device_type": 1 00:11:20.640 }, 00:11:20.640 { 00:11:20.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.640 "dma_device_type": 2 00:11:20.640 }, 00:11:20.640 { 00:11:20.640 "dma_device_id": "system", 00:11:20.640 "dma_device_type": 1 00:11:20.640 }, 00:11:20.640 { 00:11:20.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.640 "dma_device_type": 2 00:11:20.640 } 00:11:20.640 ], 00:11:20.640 "driver_specific": { 00:11:20.640 "raid": { 00:11:20.640 "uuid": "abebc25f-e341-4d73-b75a-52356199b983", 00:11:20.641 "strip_size_kb": 64, 
00:11:20.641 "state": "online", 00:11:20.641 "raid_level": "concat", 00:11:20.641 "superblock": true, 00:11:20.641 "num_base_bdevs": 2, 00:11:20.641 "num_base_bdevs_discovered": 2, 00:11:20.641 "num_base_bdevs_operational": 2, 00:11:20.641 "base_bdevs_list": [ 00:11:20.641 { 00:11:20.641 "name": "BaseBdev1", 00:11:20.641 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:20.641 "is_configured": true, 00:11:20.641 "data_offset": 2048, 00:11:20.641 "data_size": 63488 00:11:20.641 }, 00:11:20.641 { 00:11:20.641 "name": "BaseBdev2", 00:11:20.641 "uuid": "deca9820-b080-431c-a3a5-3af2b0713fa3", 00:11:20.641 "is_configured": true, 00:11:20.641 "data_offset": 2048, 00:11:20.641 "data_size": 63488 00:11:20.641 } 00:11:20.641 ] 00:11:20.641 } 00:11:20.641 } 00:11:20.641 }' 00:11:20.641 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:20.899 BaseBdev2' 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:20.899 "name": "BaseBdev1", 00:11:20.899 "aliases": [ 00:11:20.899 "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9" 00:11:20.899 ], 00:11:20.899 "product_name": "Malloc disk", 00:11:20.899 "block_size": 512, 00:11:20.899 "num_blocks": 65536, 00:11:20.899 "uuid": "87cfc2a7-fcff-4b7d-b22c-d9e4ecc9dab9", 00:11:20.899 "assigned_rate_limits": { 00:11:20.899 "rw_ios_per_sec": 0, 
00:11:20.899 "rw_mbytes_per_sec": 0, 00:11:20.899 "r_mbytes_per_sec": 0, 00:11:20.899 "w_mbytes_per_sec": 0 00:11:20.899 }, 00:11:20.899 "claimed": true, 00:11:20.899 "claim_type": "exclusive_write", 00:11:20.899 "zoned": false, 00:11:20.899 "supported_io_types": { 00:11:20.899 "read": true, 00:11:20.899 "write": true, 00:11:20.899 "unmap": true, 00:11:20.899 "flush": true, 00:11:20.899 "reset": true, 00:11:20.899 "nvme_admin": false, 00:11:20.899 "nvme_io": false, 00:11:20.899 "nvme_io_md": false, 00:11:20.899 "write_zeroes": true, 00:11:20.899 "zcopy": true, 00:11:20.899 "get_zone_info": false, 00:11:20.899 "zone_management": false, 00:11:20.899 "zone_append": false, 00:11:20.899 "compare": false, 00:11:20.899 "compare_and_write": false, 00:11:20.899 "abort": true, 00:11:20.899 "seek_hole": false, 00:11:20.899 "seek_data": false, 00:11:20.899 "copy": true, 00:11:20.899 "nvme_iov_md": false 00:11:20.899 }, 00:11:20.899 "memory_domains": [ 00:11:20.899 { 00:11:20.899 "dma_device_id": "system", 00:11:20.899 "dma_device_type": 1 00:11:20.899 }, 00:11:20.899 { 00:11:20.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.899 "dma_device_type": 2 00:11:20.899 } 00:11:20.899 ], 00:11:20.899 "driver_specific": {} 00:11:20.899 }' 00:11:20.899 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.158 
13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.158 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.418 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.418 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.418 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:21.418 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:21.418 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:21.676 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:21.676 "name": "BaseBdev2", 00:11:21.676 "aliases": [ 00:11:21.676 "deca9820-b080-431c-a3a5-3af2b0713fa3" 00:11:21.676 ], 00:11:21.676 "product_name": "Malloc disk", 00:11:21.676 "block_size": 512, 00:11:21.676 "num_blocks": 65536, 00:11:21.676 "uuid": "deca9820-b080-431c-a3a5-3af2b0713fa3", 00:11:21.676 "assigned_rate_limits": { 00:11:21.676 "rw_ios_per_sec": 0, 00:11:21.676 "rw_mbytes_per_sec": 0, 00:11:21.676 "r_mbytes_per_sec": 0, 00:11:21.676 "w_mbytes_per_sec": 0 00:11:21.676 }, 00:11:21.676 "claimed": true, 00:11:21.676 "claim_type": "exclusive_write", 00:11:21.676 "zoned": false, 00:11:21.676 "supported_io_types": { 00:11:21.676 "read": true, 00:11:21.676 "write": true, 00:11:21.676 "unmap": true, 00:11:21.676 "flush": true, 00:11:21.676 "reset": true, 00:11:21.676 "nvme_admin": false, 00:11:21.676 "nvme_io": false, 00:11:21.676 "nvme_io_md": false, 00:11:21.676 "write_zeroes": true, 00:11:21.676 "zcopy": true, 
00:11:21.676 "get_zone_info": false, 00:11:21.676 "zone_management": false, 00:11:21.676 "zone_append": false, 00:11:21.676 "compare": false, 00:11:21.676 "compare_and_write": false, 00:11:21.676 "abort": true, 00:11:21.676 "seek_hole": false, 00:11:21.676 "seek_data": false, 00:11:21.676 "copy": true, 00:11:21.676 "nvme_iov_md": false 00:11:21.676 }, 00:11:21.676 "memory_domains": [ 00:11:21.676 { 00:11:21.676 "dma_device_id": "system", 00:11:21.676 "dma_device_type": 1 00:11:21.676 }, 00:11:21.676 { 00:11:21.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.676 "dma_device_type": 2 00:11:21.676 } 00:11:21.676 ], 00:11:21.676 "driver_specific": {} 00:11:21.676 }' 00:11:21.676 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.676 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.676 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.676 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.676 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.676 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.676 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.935 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.935 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.935 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.935 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.935 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.935 13:30:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:22.194 [2024-07-15 13:30:01.457647] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:22.194 [2024-07-15 13:30:01.457677] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:22.194 [2024-07-15 13:30:01.457718] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.194 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.462 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.462 "name": "Existed_Raid", 00:11:22.462 "uuid": "abebc25f-e341-4d73-b75a-52356199b983", 00:11:22.462 "strip_size_kb": 64, 00:11:22.462 "state": "offline", 00:11:22.462 "raid_level": "concat", 00:11:22.462 "superblock": true, 00:11:22.462 "num_base_bdevs": 2, 00:11:22.462 "num_base_bdevs_discovered": 1, 00:11:22.462 "num_base_bdevs_operational": 1, 00:11:22.462 "base_bdevs_list": [ 00:11:22.462 { 00:11:22.462 "name": null, 00:11:22.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.462 "is_configured": false, 00:11:22.462 "data_offset": 2048, 00:11:22.462 "data_size": 63488 00:11:22.462 }, 00:11:22.462 { 00:11:22.462 "name": "BaseBdev2", 00:11:22.462 "uuid": "deca9820-b080-431c-a3a5-3af2b0713fa3", 00:11:22.462 "is_configured": true, 00:11:22.462 "data_offset": 2048, 00:11:22.462 "data_size": 63488 00:11:22.462 } 00:11:22.462 ] 00:11:22.462 }' 00:11:22.462 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.462 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:23.028 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:23.028 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:11:23.028 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.028 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:23.285 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:23.285 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:23.285 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:23.544 [2024-07-15 13:30:02.774504] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:23.544 [2024-07-15 13:30:02.774553] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e4d000 name Existed_Raid, state offline 00:11:23.544 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:23.544 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:23.544 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.544 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
2074405 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2074405 ']' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2074405 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2074405 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2074405' 00:11:23.802 killing process with pid 2074405 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2074405 00:11:23.802 [2024-07-15 13:30:03.104208] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:23.802 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2074405 00:11:23.802 [2024-07-15 13:30:03.105162] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:24.059 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:24.059 00:11:24.059 real 0m10.489s 00:11:24.059 user 0m18.589s 00:11:24.059 sys 0m1.992s 00:11:24.059 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:24.059 13:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.059 ************************************ 00:11:24.059 END TEST raid_state_function_test_sb 00:11:24.059 
************************************ 00:11:24.059 13:30:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:24.059 13:30:03 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:24.059 13:30:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:24.059 13:30:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:24.059 13:30:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:24.059 ************************************ 00:11:24.059 START TEST raid_superblock_test 00:11:24.059 ************************************ 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2076165 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2076165 /var/tmp/spdk-raid.sock 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2076165 ']' 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:24.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:24.059 13:30:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.059 [2024-07-15 13:30:03.482422] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:11:24.059 [2024-07-15 13:30:03.482492] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2076165 ] 00:11:24.331 [2024-07-15 13:30:03.613089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:24.331 [2024-07-15 13:30:03.710704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:24.589 [2024-07-15 13:30:03.778646] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:24.589 [2024-07-15 13:30:03.778682] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:25.155 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:25.414 malloc1 00:11:25.414 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:25.673 [2024-07-15 13:30:04.903014] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:25.673 [2024-07-15 13:30:04.903064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:25.673 [2024-07-15 13:30:04.903084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc2570 00:11:25.673 [2024-07-15 13:30:04.903096] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:25.673 [2024-07-15 13:30:04.904625] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:25.673 [2024-07-15 13:30:04.904655] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:25.673 pt1 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:25.673 13:30:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:25.673 13:30:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:25.932 malloc2 00:11:25.932 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:26.191 [2024-07-15 13:30:05.405131] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:26.191 [2024-07-15 13:30:05.405181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.191 [2024-07-15 13:30:05.405199] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc3970 00:11:26.191 [2024-07-15 13:30:05.405212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.191 [2024-07-15 13:30:05.406829] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.191 [2024-07-15 13:30:05.406860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:26.191 pt2 00:11:26.191 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:26.191 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.191 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:26.451 [2024-07-15 13:30:05.649801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:26.451 [2024-07-15 13:30:05.651094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:26.451 [2024-07-15 13:30:05.651240] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1166270 
00:11:26.451 [2024-07-15 13:30:05.651254] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:26.451 [2024-07-15 13:30:05.651453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x115bc10 00:11:26.451 [2024-07-15 13:30:05.651600] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1166270 00:11:26.451 [2024-07-15 13:30:05.651610] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1166270 00:11:26.451 [2024-07-15 13:30:05.651708] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.451 "name": "raid_bdev1", 00:11:26.451 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:26.451 "strip_size_kb": 64, 00:11:26.451 "state": "online", 00:11:26.451 "raid_level": "concat", 00:11:26.451 "superblock": true, 00:11:26.451 "num_base_bdevs": 2, 00:11:26.451 "num_base_bdevs_discovered": 2, 00:11:26.451 "num_base_bdevs_operational": 2, 00:11:26.451 "base_bdevs_list": [ 00:11:26.451 { 00:11:26.451 "name": "pt1", 00:11:26.451 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:26.451 "is_configured": true, 00:11:26.451 "data_offset": 2048, 00:11:26.451 "data_size": 63488 00:11:26.451 }, 00:11:26.451 { 00:11:26.451 "name": "pt2", 00:11:26.451 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:26.451 "is_configured": true, 00:11:26.451 "data_offset": 2048, 00:11:26.451 "data_size": 63488 00:11:26.451 } 00:11:26.451 ] 00:11:26.451 }' 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.451 13:30:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.388 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:27.389 [2024-07-15 13:30:06.672722] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:27.389 "name": "raid_bdev1", 00:11:27.389 "aliases": [ 00:11:27.389 "37a5944d-688e-479f-a23f-1309ec4b7a88" 00:11:27.389 ], 00:11:27.389 "product_name": "Raid Volume", 00:11:27.389 "block_size": 512, 00:11:27.389 "num_blocks": 126976, 00:11:27.389 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:27.389 "assigned_rate_limits": { 00:11:27.389 "rw_ios_per_sec": 0, 00:11:27.389 "rw_mbytes_per_sec": 0, 00:11:27.389 "r_mbytes_per_sec": 0, 00:11:27.389 "w_mbytes_per_sec": 0 00:11:27.389 }, 00:11:27.389 "claimed": false, 00:11:27.389 "zoned": false, 00:11:27.389 "supported_io_types": { 00:11:27.389 "read": true, 00:11:27.389 "write": true, 00:11:27.389 "unmap": true, 00:11:27.389 "flush": true, 00:11:27.389 "reset": true, 00:11:27.389 "nvme_admin": false, 00:11:27.389 "nvme_io": false, 00:11:27.389 "nvme_io_md": false, 00:11:27.389 "write_zeroes": true, 00:11:27.389 "zcopy": false, 00:11:27.389 "get_zone_info": false, 00:11:27.389 "zone_management": false, 00:11:27.389 "zone_append": false, 00:11:27.389 "compare": false, 00:11:27.389 "compare_and_write": false, 00:11:27.389 "abort": false, 00:11:27.389 "seek_hole": false, 00:11:27.389 "seek_data": false, 00:11:27.389 "copy": false, 00:11:27.389 "nvme_iov_md": false 00:11:27.389 }, 00:11:27.389 "memory_domains": [ 00:11:27.389 { 00:11:27.389 "dma_device_id": "system", 00:11:27.389 "dma_device_type": 1 00:11:27.389 }, 00:11:27.389 { 00:11:27.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.389 "dma_device_type": 2 00:11:27.389 }, 00:11:27.389 { 00:11:27.389 "dma_device_id": "system", 
00:11:27.389 "dma_device_type": 1 00:11:27.389 }, 00:11:27.389 { 00:11:27.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.389 "dma_device_type": 2 00:11:27.389 } 00:11:27.389 ], 00:11:27.389 "driver_specific": { 00:11:27.389 "raid": { 00:11:27.389 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:27.389 "strip_size_kb": 64, 00:11:27.389 "state": "online", 00:11:27.389 "raid_level": "concat", 00:11:27.389 "superblock": true, 00:11:27.389 "num_base_bdevs": 2, 00:11:27.389 "num_base_bdevs_discovered": 2, 00:11:27.389 "num_base_bdevs_operational": 2, 00:11:27.389 "base_bdevs_list": [ 00:11:27.389 { 00:11:27.389 "name": "pt1", 00:11:27.389 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.389 "is_configured": true, 00:11:27.389 "data_offset": 2048, 00:11:27.389 "data_size": 63488 00:11:27.389 }, 00:11:27.389 { 00:11:27.389 "name": "pt2", 00:11:27.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:27.389 "is_configured": true, 00:11:27.389 "data_offset": 2048, 00:11:27.389 "data_size": 63488 00:11:27.389 } 00:11:27.389 ] 00:11:27.389 } 00:11:27.389 } 00:11:27.389 }' 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:27.389 pt2' 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:27.389 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.648 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.648 "name": "pt1", 00:11:27.648 "aliases": [ 00:11:27.648 "00000000-0000-0000-0000-000000000001" 
00:11:27.648 ], 00:11:27.648 "product_name": "passthru", 00:11:27.648 "block_size": 512, 00:11:27.648 "num_blocks": 65536, 00:11:27.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.649 "assigned_rate_limits": { 00:11:27.649 "rw_ios_per_sec": 0, 00:11:27.649 "rw_mbytes_per_sec": 0, 00:11:27.649 "r_mbytes_per_sec": 0, 00:11:27.649 "w_mbytes_per_sec": 0 00:11:27.649 }, 00:11:27.649 "claimed": true, 00:11:27.649 "claim_type": "exclusive_write", 00:11:27.649 "zoned": false, 00:11:27.649 "supported_io_types": { 00:11:27.649 "read": true, 00:11:27.649 "write": true, 00:11:27.649 "unmap": true, 00:11:27.649 "flush": true, 00:11:27.649 "reset": true, 00:11:27.649 "nvme_admin": false, 00:11:27.649 "nvme_io": false, 00:11:27.649 "nvme_io_md": false, 00:11:27.649 "write_zeroes": true, 00:11:27.649 "zcopy": true, 00:11:27.649 "get_zone_info": false, 00:11:27.649 "zone_management": false, 00:11:27.649 "zone_append": false, 00:11:27.649 "compare": false, 00:11:27.649 "compare_and_write": false, 00:11:27.649 "abort": true, 00:11:27.649 "seek_hole": false, 00:11:27.649 "seek_data": false, 00:11:27.649 "copy": true, 00:11:27.649 "nvme_iov_md": false 00:11:27.649 }, 00:11:27.649 "memory_domains": [ 00:11:27.649 { 00:11:27.649 "dma_device_id": "system", 00:11:27.649 "dma_device_type": 1 00:11:27.649 }, 00:11:27.649 { 00:11:27.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.649 "dma_device_type": 2 00:11:27.649 } 00:11:27.649 ], 00:11:27.649 "driver_specific": { 00:11:27.649 "passthru": { 00:11:27.649 "name": "pt1", 00:11:27.649 "base_bdev_name": "malloc1" 00:11:27.649 } 00:11:27.649 } 00:11:27.649 }' 00:11:27.649 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.649 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:27.908 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.167 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.167 "name": "pt2", 00:11:28.167 "aliases": [ 00:11:28.167 "00000000-0000-0000-0000-000000000002" 00:11:28.167 ], 00:11:28.167 "product_name": "passthru", 00:11:28.167 "block_size": 512, 00:11:28.167 "num_blocks": 65536, 00:11:28.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:28.167 "assigned_rate_limits": { 00:11:28.167 "rw_ios_per_sec": 0, 00:11:28.167 "rw_mbytes_per_sec": 0, 00:11:28.167 "r_mbytes_per_sec": 0, 00:11:28.167 "w_mbytes_per_sec": 0 00:11:28.167 }, 00:11:28.167 "claimed": true, 00:11:28.167 "claim_type": "exclusive_write", 00:11:28.167 "zoned": false, 00:11:28.167 "supported_io_types": { 00:11:28.167 "read": true, 
00:11:28.167 "write": true, 00:11:28.167 "unmap": true, 00:11:28.167 "flush": true, 00:11:28.167 "reset": true, 00:11:28.167 "nvme_admin": false, 00:11:28.167 "nvme_io": false, 00:11:28.167 "nvme_io_md": false, 00:11:28.167 "write_zeroes": true, 00:11:28.167 "zcopy": true, 00:11:28.167 "get_zone_info": false, 00:11:28.167 "zone_management": false, 00:11:28.167 "zone_append": false, 00:11:28.167 "compare": false, 00:11:28.167 "compare_and_write": false, 00:11:28.167 "abort": true, 00:11:28.167 "seek_hole": false, 00:11:28.167 "seek_data": false, 00:11:28.167 "copy": true, 00:11:28.167 "nvme_iov_md": false 00:11:28.167 }, 00:11:28.167 "memory_domains": [ 00:11:28.167 { 00:11:28.167 "dma_device_id": "system", 00:11:28.168 "dma_device_type": 1 00:11:28.168 }, 00:11:28.168 { 00:11:28.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.168 "dma_device_type": 2 00:11:28.168 } 00:11:28.168 ], 00:11:28.168 "driver_specific": { 00:11:28.168 "passthru": { 00:11:28.168 "name": "pt2", 00:11:28.168 "base_bdev_name": "malloc2" 00:11:28.168 } 00:11:28.168 } 00:11:28.168 }' 00:11:28.168 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.168 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.168 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.168 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.426 13:30:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:28.426 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:28.685 [2024-07-15 13:30:08.036345] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.685 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=37a5944d-688e-479f-a23f-1309ec4b7a88 00:11:28.685 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 37a5944d-688e-479f-a23f-1309ec4b7a88 ']' 00:11:28.685 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:28.945 [2024-07-15 13:30:08.284750] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:28.945 [2024-07-15 13:30:08.284773] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:28.945 [2024-07-15 13:30:08.284831] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:28.945 [2024-07-15 13:30:08.284880] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:28.945 [2024-07-15 13:30:08.284892] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1166270 name raid_bdev1, state offline 00:11:28.945 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.945 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:29.204 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:29.204 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:29.204 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:29.204 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:29.794 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:29.794 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:30.054 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:30.054 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n 
raid_bdev1 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:30.321 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.631 [2024-07-15 13:30:09.752580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:30.631 [2024-07-15 13:30:09.753985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:30.631 [2024-07-15 13:30:09.754056] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:30.631 [2024-07-15 13:30:09.754100] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:11:30.631 [2024-07-15 13:30:09.754120] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:30.631 [2024-07-15 13:30:09.754132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1165ff0 name raid_bdev1, state configuring 00:11:30.631 request: 00:11:30.631 { 00:11:30.631 "name": "raid_bdev1", 00:11:30.631 "raid_level": "concat", 00:11:30.631 "base_bdevs": [ 00:11:30.631 "malloc1", 00:11:30.631 "malloc2" 00:11:30.631 ], 00:11:30.631 "strip_size_kb": 64, 00:11:30.631 "superblock": false, 00:11:30.631 "method": "bdev_raid_create", 00:11:30.631 "req_id": 1 00:11:30.631 } 00:11:30.631 Got JSON-RPC error response 00:11:30.631 response: 00:11:30.631 { 00:11:30.631 "code": -17, 00:11:30.631 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:30.631 } 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.631 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:30.631 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:30.631 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:30.631 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:30.890 [2024-07-15 
13:30:10.233789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:30.890 [2024-07-15 13:30:10.233839] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.890 [2024-07-15 13:30:10.233862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc27a0 00:11:30.890 [2024-07-15 13:30:10.233875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.890 [2024-07-15 13:30:10.235481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:30.890 [2024-07-15 13:30:10.235512] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:30.890 [2024-07-15 13:30:10.235583] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:30.890 [2024-07-15 13:30:10.235611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:30.890 pt1 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.890 13:30:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.890 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.149 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.149 "name": "raid_bdev1", 00:11:31.149 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:31.149 "strip_size_kb": 64, 00:11:31.149 "state": "configuring", 00:11:31.149 "raid_level": "concat", 00:11:31.149 "superblock": true, 00:11:31.149 "num_base_bdevs": 2, 00:11:31.149 "num_base_bdevs_discovered": 1, 00:11:31.149 "num_base_bdevs_operational": 2, 00:11:31.149 "base_bdevs_list": [ 00:11:31.149 { 00:11:31.149 "name": "pt1", 00:11:31.149 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:31.149 "is_configured": true, 00:11:31.149 "data_offset": 2048, 00:11:31.149 "data_size": 63488 00:11:31.149 }, 00:11:31.149 { 00:11:31.149 "name": null, 00:11:31.149 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:31.149 "is_configured": false, 00:11:31.149 "data_offset": 2048, 00:11:31.149 "data_size": 63488 00:11:31.149 } 00:11:31.149 ] 00:11:31.149 }' 00:11:31.149 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.149 13:30:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.714 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:31.714 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:31.714 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:31.714 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:31.972 [2024-07-15 13:30:11.260506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:31.972 [2024-07-15 13:30:11.260556] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:31.972 [2024-07-15 13:30:11.260574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x115c820 00:11:31.972 [2024-07-15 13:30:11.260586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:31.972 [2024-07-15 13:30:11.260949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:31.972 [2024-07-15 13:30:11.260970] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:31.972 [2024-07-15 13:30:11.261033] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:31.972 [2024-07-15 13:30:11.261053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:31.972 [2024-07-15 13:30:11.261150] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb8ec0 00:11:31.972 [2024-07-15 13:30:11.261161] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:31.972 [2024-07-15 13:30:11.261333] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb9f00 00:11:31.972 [2024-07-15 13:30:11.261458] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb8ec0 00:11:31.972 [2024-07-15 13:30:11.261469] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfb8ec0 00:11:31.972 [2024-07-15 13:30:11.261566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.972 pt2 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.972 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:32.230 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.230 "name": "raid_bdev1", 00:11:32.230 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:32.230 "strip_size_kb": 64, 00:11:32.230 "state": "online", 00:11:32.230 "raid_level": "concat", 00:11:32.230 "superblock": true, 00:11:32.230 "num_base_bdevs": 2, 00:11:32.230 "num_base_bdevs_discovered": 2, 00:11:32.230 "num_base_bdevs_operational": 2, 
00:11:32.230 "base_bdevs_list": [ 00:11:32.230 { 00:11:32.230 "name": "pt1", 00:11:32.230 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.230 "is_configured": true, 00:11:32.230 "data_offset": 2048, 00:11:32.230 "data_size": 63488 00:11:32.230 }, 00:11:32.230 { 00:11:32.230 "name": "pt2", 00:11:32.230 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.230 "is_configured": true, 00:11:32.230 "data_offset": 2048, 00:11:32.230 "data_size": 63488 00:11:32.230 } 00:11:32.230 ] 00:11:32.230 }' 00:11:32.230 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.230 13:30:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:32.797 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:33.055 [2024-07-15 13:30:12.343715] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:33.055 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:33.056 "name": "raid_bdev1", 00:11:33.056 "aliases": [ 00:11:33.056 "37a5944d-688e-479f-a23f-1309ec4b7a88" 00:11:33.056 ], 
00:11:33.056 "product_name": "Raid Volume", 00:11:33.056 "block_size": 512, 00:11:33.056 "num_blocks": 126976, 00:11:33.056 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:33.056 "assigned_rate_limits": { 00:11:33.056 "rw_ios_per_sec": 0, 00:11:33.056 "rw_mbytes_per_sec": 0, 00:11:33.056 "r_mbytes_per_sec": 0, 00:11:33.056 "w_mbytes_per_sec": 0 00:11:33.056 }, 00:11:33.056 "claimed": false, 00:11:33.056 "zoned": false, 00:11:33.056 "supported_io_types": { 00:11:33.056 "read": true, 00:11:33.056 "write": true, 00:11:33.056 "unmap": true, 00:11:33.056 "flush": true, 00:11:33.056 "reset": true, 00:11:33.056 "nvme_admin": false, 00:11:33.056 "nvme_io": false, 00:11:33.056 "nvme_io_md": false, 00:11:33.056 "write_zeroes": true, 00:11:33.056 "zcopy": false, 00:11:33.056 "get_zone_info": false, 00:11:33.056 "zone_management": false, 00:11:33.056 "zone_append": false, 00:11:33.056 "compare": false, 00:11:33.056 "compare_and_write": false, 00:11:33.056 "abort": false, 00:11:33.056 "seek_hole": false, 00:11:33.056 "seek_data": false, 00:11:33.056 "copy": false, 00:11:33.056 "nvme_iov_md": false 00:11:33.056 }, 00:11:33.056 "memory_domains": [ 00:11:33.056 { 00:11:33.056 "dma_device_id": "system", 00:11:33.056 "dma_device_type": 1 00:11:33.056 }, 00:11:33.056 { 00:11:33.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.056 "dma_device_type": 2 00:11:33.056 }, 00:11:33.056 { 00:11:33.056 "dma_device_id": "system", 00:11:33.056 "dma_device_type": 1 00:11:33.056 }, 00:11:33.056 { 00:11:33.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.056 "dma_device_type": 2 00:11:33.056 } 00:11:33.056 ], 00:11:33.056 "driver_specific": { 00:11:33.056 "raid": { 00:11:33.056 "uuid": "37a5944d-688e-479f-a23f-1309ec4b7a88", 00:11:33.056 "strip_size_kb": 64, 00:11:33.056 "state": "online", 00:11:33.056 "raid_level": "concat", 00:11:33.056 "superblock": true, 00:11:33.056 "num_base_bdevs": 2, 00:11:33.056 "num_base_bdevs_discovered": 2, 00:11:33.056 "num_base_bdevs_operational": 
2, 00:11:33.056 "base_bdevs_list": [ 00:11:33.056 { 00:11:33.056 "name": "pt1", 00:11:33.056 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:33.056 "is_configured": true, 00:11:33.056 "data_offset": 2048, 00:11:33.056 "data_size": 63488 00:11:33.056 }, 00:11:33.056 { 00:11:33.056 "name": "pt2", 00:11:33.056 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:33.056 "is_configured": true, 00:11:33.056 "data_offset": 2048, 00:11:33.056 "data_size": 63488 00:11:33.056 } 00:11:33.056 ] 00:11:33.056 } 00:11:33.056 } 00:11:33.056 }' 00:11:33.056 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:33.056 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:33.056 pt2' 00:11:33.056 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.056 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:33.056 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:33.315 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:33.315 "name": "pt1", 00:11:33.315 "aliases": [ 00:11:33.315 "00000000-0000-0000-0000-000000000001" 00:11:33.315 ], 00:11:33.315 "product_name": "passthru", 00:11:33.315 "block_size": 512, 00:11:33.315 "num_blocks": 65536, 00:11:33.315 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:33.315 "assigned_rate_limits": { 00:11:33.315 "rw_ios_per_sec": 0, 00:11:33.315 "rw_mbytes_per_sec": 0, 00:11:33.315 "r_mbytes_per_sec": 0, 00:11:33.315 "w_mbytes_per_sec": 0 00:11:33.315 }, 00:11:33.315 "claimed": true, 00:11:33.315 "claim_type": "exclusive_write", 00:11:33.315 "zoned": false, 00:11:33.315 "supported_io_types": { 00:11:33.315 "read": true, 
00:11:33.315 "write": true, 00:11:33.315 "unmap": true, 00:11:33.315 "flush": true, 00:11:33.315 "reset": true, 00:11:33.315 "nvme_admin": false, 00:11:33.315 "nvme_io": false, 00:11:33.315 "nvme_io_md": false, 00:11:33.315 "write_zeroes": true, 00:11:33.315 "zcopy": true, 00:11:33.315 "get_zone_info": false, 00:11:33.315 "zone_management": false, 00:11:33.315 "zone_append": false, 00:11:33.315 "compare": false, 00:11:33.315 "compare_and_write": false, 00:11:33.315 "abort": true, 00:11:33.315 "seek_hole": false, 00:11:33.315 "seek_data": false, 00:11:33.315 "copy": true, 00:11:33.315 "nvme_iov_md": false 00:11:33.315 }, 00:11:33.315 "memory_domains": [ 00:11:33.315 { 00:11:33.315 "dma_device_id": "system", 00:11:33.315 "dma_device_type": 1 00:11:33.315 }, 00:11:33.315 { 00:11:33.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.315 "dma_device_type": 2 00:11:33.315 } 00:11:33.315 ], 00:11:33.315 "driver_specific": { 00:11:33.315 "passthru": { 00:11:33.315 "name": "pt1", 00:11:33.315 "base_bdev_name": "malloc1" 00:11:33.315 } 00:11:33.315 } 00:11:33.315 }' 00:11:33.315 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.315 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.574 13:30:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.574 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.833 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.833 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.833 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:33.833 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:34.092 "name": "pt2", 00:11:34.092 "aliases": [ 00:11:34.092 "00000000-0000-0000-0000-000000000002" 00:11:34.092 ], 00:11:34.092 "product_name": "passthru", 00:11:34.092 "block_size": 512, 00:11:34.092 "num_blocks": 65536, 00:11:34.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:34.092 "assigned_rate_limits": { 00:11:34.092 "rw_ios_per_sec": 0, 00:11:34.092 "rw_mbytes_per_sec": 0, 00:11:34.092 "r_mbytes_per_sec": 0, 00:11:34.092 "w_mbytes_per_sec": 0 00:11:34.092 }, 00:11:34.092 "claimed": true, 00:11:34.092 "claim_type": "exclusive_write", 00:11:34.092 "zoned": false, 00:11:34.092 "supported_io_types": { 00:11:34.092 "read": true, 00:11:34.092 "write": true, 00:11:34.092 "unmap": true, 00:11:34.092 "flush": true, 00:11:34.092 "reset": true, 00:11:34.092 "nvme_admin": false, 00:11:34.092 "nvme_io": false, 00:11:34.092 "nvme_io_md": false, 00:11:34.092 "write_zeroes": true, 00:11:34.092 "zcopy": true, 00:11:34.092 "get_zone_info": false, 00:11:34.092 "zone_management": false, 00:11:34.092 "zone_append": false, 00:11:34.092 "compare": false, 00:11:34.092 "compare_and_write": false, 00:11:34.092 "abort": true, 00:11:34.092 "seek_hole": false, 00:11:34.092 "seek_data": false, 00:11:34.092 "copy": 
true, 00:11:34.092 "nvme_iov_md": false 00:11:34.092 }, 00:11:34.092 "memory_domains": [ 00:11:34.092 { 00:11:34.092 "dma_device_id": "system", 00:11:34.092 "dma_device_type": 1 00:11:34.092 }, 00:11:34.092 { 00:11:34.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:34.092 "dma_device_type": 2 00:11:34.092 } 00:11:34.092 ], 00:11:34.092 "driver_specific": { 00:11:34.092 "passthru": { 00:11:34.092 "name": "pt2", 00:11:34.092 "base_bdev_name": "malloc2" 00:11:34.092 } 00:11:34.092 } 00:11:34.092 }' 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.092 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:34.350 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:11:34.609 [2024-07-15 13:30:13.863746] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 37a5944d-688e-479f-a23f-1309ec4b7a88 '!=' 37a5944d-688e-479f-a23f-1309ec4b7a88 ']' 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2076165 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2076165 ']' 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2076165 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2076165 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2076165' 00:11:34.609 killing process with pid 2076165 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2076165 00:11:34.609 [2024-07-15 13:30:13.935026] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:34.609 [2024-07-15 13:30:13.935084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:34.609 [2024-07-15 
13:30:13.935128] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:34.609 [2024-07-15 13:30:13.935140] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb8ec0 name raid_bdev1, state offline 00:11:34.609 13:30:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2076165 00:11:34.609 [2024-07-15 13:30:13.952516] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.866 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:34.866 00:11:34.866 real 0m10.757s 00:11:34.866 user 0m19.258s 00:11:34.866 sys 0m1.926s 00:11:34.866 13:30:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:34.866 13:30:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.866 ************************************ 00:11:34.866 END TEST raid_superblock_test 00:11:34.866 ************************************ 00:11:34.866 13:30:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:34.866 13:30:14 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:11:34.866 13:30:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:34.866 13:30:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.866 13:30:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.866 ************************************ 00:11:34.866 START TEST raid_read_error_test 00:11:34.866 ************************************ 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:34.866 13:30:14 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:34.866 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.LjSl8pGGxp 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2078173 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2078173 /var/tmp/spdk-raid.sock 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2078173 ']' 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.867 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:34.867 13:30:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.125 [2024-07-15 13:30:14.314535] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:11:35.125 [2024-07-15 13:30:14.314601] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2078173 ] 00:11:35.126 [2024-07-15 13:30:14.444622] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:35.126 [2024-07-15 13:30:14.549257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:35.384 [2024-07-15 13:30:14.611869] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.384 [2024-07-15 13:30:14.611907] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.950 13:30:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:35.950 13:30:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:35.950 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:35.950 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:36.209 BaseBdev1_malloc 00:11:36.209 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:36.467 true 00:11:36.467 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:36.726 [2024-07-15 13:30:15.957381] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:36.726 [2024-07-15 13:30:15.957427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:11:36.726 [2024-07-15 13:30:15.957451] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dad0d0 00:11:36.726 [2024-07-15 13:30:15.957464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.726 [2024-07-15 13:30:15.959382] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.726 [2024-07-15 13:30:15.959416] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:36.726 BaseBdev1 00:11:36.726 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:36.726 13:30:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:36.984 BaseBdev2_malloc 00:11:36.984 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:37.243 true 00:11:37.243 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:37.502 [2024-07-15 13:30:16.687921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:37.502 [2024-07-15 13:30:16.687971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.502 [2024-07-15 13:30:16.687994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db1910 00:11:37.502 [2024-07-15 13:30:16.688006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.502 [2024-07-15 13:30:16.689625] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.502 [2024-07-15 13:30:16.689654] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:37.502 BaseBdev2 00:11:37.502 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:37.762 [2024-07-15 13:30:16.928597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:37.762 [2024-07-15 13:30:16.929973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:37.762 [2024-07-15 13:30:16.930175] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db3320 00:11:37.762 [2024-07-15 13:30:16.930189] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:37.762 [2024-07-15 13:30:16.930388] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db4290 00:11:37.762 [2024-07-15 13:30:16.930534] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db3320 00:11:37.762 [2024-07-15 13:30:16.930545] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db3320 00:11:37.762 [2024-07-15 13:30:16.930648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.762 13:30:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:38.021 13:30:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.022 "name": "raid_bdev1", 00:11:38.022 "uuid": "697cbd36-b7e6-4a18-80f2-462e266e9cba", 00:11:38.022 "strip_size_kb": 64, 00:11:38.022 "state": "online", 00:11:38.022 "raid_level": "concat", 00:11:38.022 "superblock": true, 00:11:38.022 "num_base_bdevs": 2, 00:11:38.022 "num_base_bdevs_discovered": 2, 00:11:38.022 "num_base_bdevs_operational": 2, 00:11:38.022 "base_bdevs_list": [ 00:11:38.022 { 00:11:38.022 "name": "BaseBdev1", 00:11:38.022 "uuid": "3f9eb745-6259-5d42-b391-2121e1d3450d", 00:11:38.022 "is_configured": true, 00:11:38.022 "data_offset": 2048, 00:11:38.022 "data_size": 63488 00:11:38.022 }, 00:11:38.022 { 00:11:38.022 "name": "BaseBdev2", 00:11:38.022 "uuid": "dcbb7242-d0b6-5470-bfba-37a68a28a701", 00:11:38.022 "is_configured": true, 00:11:38.022 "data_offset": 2048, 00:11:38.022 "data_size": 63488 00:11:38.022 } 00:11:38.022 ] 00:11:38.022 }' 00:11:38.022 13:30:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.022 13:30:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.590 13:30:17 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:38.590 13:30:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:38.590 [2024-07-15 13:30:17.867365] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dae9b0 00:11:39.527 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.786 13:30:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.786 13:30:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:39.786 13:30:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.787 "name": "raid_bdev1", 00:11:39.787 "uuid": "697cbd36-b7e6-4a18-80f2-462e266e9cba", 00:11:39.787 "strip_size_kb": 64, 00:11:39.787 "state": "online", 00:11:39.787 "raid_level": "concat", 00:11:39.787 "superblock": true, 00:11:39.787 "num_base_bdevs": 2, 00:11:39.787 "num_base_bdevs_discovered": 2, 00:11:39.787 "num_base_bdevs_operational": 2, 00:11:39.787 "base_bdevs_list": [ 00:11:39.787 { 00:11:39.787 "name": "BaseBdev1", 00:11:39.787 "uuid": "3f9eb745-6259-5d42-b391-2121e1d3450d", 00:11:39.787 "is_configured": true, 00:11:39.787 "data_offset": 2048, 00:11:39.787 "data_size": 63488 00:11:39.787 }, 00:11:39.787 { 00:11:39.787 "name": "BaseBdev2", 00:11:39.787 "uuid": "dcbb7242-d0b6-5470-bfba-37a68a28a701", 00:11:39.787 "is_configured": true, 00:11:39.787 "data_offset": 2048, 00:11:39.787 "data_size": 63488 00:11:39.787 } 00:11:39.787 ] 00:11:39.787 }' 00:11:39.787 13:30:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.787 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.356 13:30:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:40.615 [2024-07-15 13:30:19.926440] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:40.615 [2024-07-15 13:30:19.926482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:11:40.615 [2024-07-15 13:30:19.929658] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:40.615 [2024-07-15 13:30:19.929687] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:40.615 [2024-07-15 13:30:19.929715] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:40.615 [2024-07-15 13:30:19.929727] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db3320 name raid_bdev1, state offline 00:11:40.615 0 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2078173 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2078173 ']' 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2078173 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2078173 00:11:40.615 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:40.616 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:40.616 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2078173' 00:11:40.616 killing process with pid 2078173 00:11:40.616 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2078173 00:11:40.616 [2024-07-15 13:30:19.993987] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.616 13:30:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2078173 00:11:40.616 [2024-07-15 13:30:20.004627] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.LjSl8pGGxp 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:11:40.874 00:11:40.874 real 0m6.004s 00:11:40.874 user 0m9.319s 00:11:40.874 sys 0m1.053s 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.874 13:30:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.874 ************************************ 00:11:40.874 END TEST raid_read_error_test 00:11:40.874 ************************************ 00:11:40.874 13:30:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:40.874 13:30:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:11:40.874 13:30:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:40.874 13:30:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.874 13:30:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:41.133 ************************************ 00:11:41.133 START TEST raid_write_error_test 00:11:41.133 ************************************ 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:41.134 13:30:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mQTxbDaJMf 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2079142 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2079142 /var/tmp/spdk-raid.sock 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2079142 ']' 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:41.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:41.134 13:30:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.134 [2024-07-15 13:30:20.397157] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:11:41.134 [2024-07-15 13:30:20.397210] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2079142 ] 00:11:41.134 [2024-07-15 13:30:20.510742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:41.393 [2024-07-15 13:30:20.612515] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:41.393 [2024-07-15 13:30:20.675921] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.393 [2024-07-15 13:30:20.675965] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.961 13:30:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:41.961 13:30:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:41.961 13:30:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.961 13:30:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:42.220 BaseBdev1_malloc 00:11:42.220 13:30:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:42.479 true 00:11:42.479 13:30:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:42.738 [2024-07-15 13:30:21.992976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:42.738 [2024-07-15 13:30:21.993021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:11:42.738 [2024-07-15 13:30:21.993042] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20920d0 00:11:42.738 [2024-07-15 13:30:21.993054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.738 [2024-07-15 13:30:21.994847] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.738 [2024-07-15 13:30:21.994878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:42.738 BaseBdev1 00:11:42.738 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:42.738 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:42.997 BaseBdev2_malloc 00:11:42.997 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:43.256 true 00:11:43.256 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:43.515 [2024-07-15 13:30:22.743608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:43.515 [2024-07-15 13:30:22.743653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:43.515 [2024-07-15 13:30:22.743673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2096910 00:11:43.515 [2024-07-15 13:30:22.743686] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:43.515 [2024-07-15 13:30:22.745197] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:43.515 [2024-07-15 13:30:22.745226] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:43.515 BaseBdev2 00:11:43.515 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:43.515 [2024-07-15 13:30:22.928134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:43.515 [2024-07-15 13:30:22.929404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:43.515 [2024-07-15 13:30:22.929596] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2098320 00:11:43.515 [2024-07-15 13:30:22.929609] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:43.515 [2024-07-15 13:30:22.929806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2099290 00:11:43.515 [2024-07-15 13:30:22.929961] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2098320 00:11:43.515 [2024-07-15 13:30:22.929972] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2098320 00:11:43.515 [2024-07-15 13:30:22.930074] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.775 13:30:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.037 13:30:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.037 "name": "raid_bdev1", 00:11:44.037 "uuid": "0d8bd83b-a2e9-444a-869e-ea746caef7a5", 00:11:44.037 "strip_size_kb": 64, 00:11:44.037 "state": "online", 00:11:44.037 "raid_level": "concat", 00:11:44.037 "superblock": true, 00:11:44.037 "num_base_bdevs": 2, 00:11:44.037 "num_base_bdevs_discovered": 2, 00:11:44.037 "num_base_bdevs_operational": 2, 00:11:44.037 "base_bdevs_list": [ 00:11:44.037 { 00:11:44.037 "name": "BaseBdev1", 00:11:44.037 "uuid": "746fbe31-9c76-5bad-918d-97848002eaae", 00:11:44.037 "is_configured": true, 00:11:44.037 "data_offset": 2048, 00:11:44.037 "data_size": 63488 00:11:44.037 }, 00:11:44.037 { 00:11:44.037 "name": "BaseBdev2", 00:11:44.037 "uuid": "1d8c1f92-5342-519f-8405-3dc849b10838", 00:11:44.037 "is_configured": true, 00:11:44.037 "data_offset": 2048, 00:11:44.037 "data_size": 63488 00:11:44.037 } 00:11:44.037 ] 00:11:44.037 }' 00:11:44.037 13:30:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.037 13:30:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.604 
13:30:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:44.604 13:30:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:44.604 [2024-07-15 13:30:23.878918] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20939b0 00:11:45.542 13:30:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.801 "name": "raid_bdev1", 00:11:45.801 "uuid": "0d8bd83b-a2e9-444a-869e-ea746caef7a5", 00:11:45.801 "strip_size_kb": 64, 00:11:45.801 "state": "online", 00:11:45.801 "raid_level": "concat", 00:11:45.801 "superblock": true, 00:11:45.801 "num_base_bdevs": 2, 00:11:45.801 "num_base_bdevs_discovered": 2, 00:11:45.801 "num_base_bdevs_operational": 2, 00:11:45.801 "base_bdevs_list": [ 00:11:45.801 { 00:11:45.801 "name": "BaseBdev1", 00:11:45.801 "uuid": "746fbe31-9c76-5bad-918d-97848002eaae", 00:11:45.801 "is_configured": true, 00:11:45.801 "data_offset": 2048, 00:11:45.801 "data_size": 63488 00:11:45.801 }, 00:11:45.801 { 00:11:45.801 "name": "BaseBdev2", 00:11:45.801 "uuid": "1d8c1f92-5342-519f-8405-3dc849b10838", 00:11:45.801 "is_configured": true, 00:11:45.801 "data_offset": 2048, 00:11:45.801 "data_size": 63488 00:11:45.801 } 00:11:45.801 ] 00:11:45.801 }' 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.801 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.368 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:46.627 [2024-07-15 13:30:25.896816] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:46.627 [2024-07-15 13:30:25.896861] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:11:46.627 [2024-07-15 13:30:25.900042] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:46.627 [2024-07-15 13:30:25.900074] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:46.627 [2024-07-15 13:30:25.900109] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:46.627 [2024-07-15 13:30:25.900120] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2098320 name raid_bdev1, state offline 00:11:46.627 0 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2079142 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2079142 ']' 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2079142 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2079142 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2079142' 00:11:46.627 killing process with pid 2079142 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2079142 00:11:46.627 [2024-07-15 13:30:25.963497] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:46.627 13:30:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2079142 
00:11:46.627 [2024-07-15 13:30:25.974068] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mQTxbDaJMf 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:46.886 00:11:46.886 real 0m5.884s 00:11:46.886 user 0m9.092s 00:11:46.886 sys 0m1.050s 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:46.886 13:30:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.886 ************************************ 00:11:46.886 END TEST raid_write_error_test 00:11:46.886 ************************************ 00:11:46.886 13:30:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:46.886 13:30:26 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:46.886 13:30:26 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:46.886 13:30:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:46.886 13:30:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:46.886 13:30:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:46.886 ************************************ 00:11:46.886 START TEST 
raid_state_function_test 00:11:46.886 ************************************ 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2079951 00:11:46.886 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2079951' 00:11:46.886 Process raid pid: 2079951 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2079951 /var/tmp/spdk-raid.sock 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2079951 ']' 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:47.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:47.147 13:30:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.147 [2024-07-15 13:30:26.371948] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:11:47.147 [2024-07-15 13:30:26.372019] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:47.147 [2024-07-15 13:30:26.503594] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:47.468 [2024-07-15 13:30:26.608565] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:47.468 [2024-07-15 13:30:26.669590] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.468 [2024-07-15 13:30:26.669618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.033 13:30:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:48.033 13:30:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:48.033 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:48.292 [2024-07-15 13:30:27.459587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:48.292 [2024-07-15 13:30:27.459631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:48.292 [2024-07-15 13:30:27.459646] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:48.292 [2024-07-15 13:30:27.459658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.292 "name": "Existed_Raid", 00:11:48.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.292 "strip_size_kb": 0, 00:11:48.292 "state": "configuring", 00:11:48.292 "raid_level": "raid1", 00:11:48.292 "superblock": false, 00:11:48.292 "num_base_bdevs": 2, 00:11:48.292 "num_base_bdevs_discovered": 0, 00:11:48.292 "num_base_bdevs_operational": 2, 
00:11:48.292 "base_bdevs_list": [ 00:11:48.292 { 00:11:48.292 "name": "BaseBdev1", 00:11:48.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.292 "is_configured": false, 00:11:48.292 "data_offset": 0, 00:11:48.292 "data_size": 0 00:11:48.292 }, 00:11:48.292 { 00:11:48.292 "name": "BaseBdev2", 00:11:48.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.292 "is_configured": false, 00:11:48.292 "data_offset": 0, 00:11:48.292 "data_size": 0 00:11:48.292 } 00:11:48.292 ] 00:11:48.292 }' 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.292 13:30:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.859 13:30:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:49.117 [2024-07-15 13:30:28.409981] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:49.117 [2024-07-15 13:30:28.410010] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacfa80 name Existed_Raid, state configuring 00:11:49.117 13:30:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:49.377 [2024-07-15 13:30:28.582446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:49.377 [2024-07-15 13:30:28.582472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:49.377 [2024-07-15 13:30:28.582482] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:49.377 [2024-07-15 13:30:28.582493] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:49.377 13:30:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:49.377 [2024-07-15 13:30:28.768890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:49.377 BaseBdev1 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:49.377 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:49.636 13:30:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:49.896 [ 00:11:49.896 { 00:11:49.896 "name": "BaseBdev1", 00:11:49.896 "aliases": [ 00:11:49.896 "dff3178f-5f68-4ed2-81fd-a427b609f358" 00:11:49.896 ], 00:11:49.896 "product_name": "Malloc disk", 00:11:49.896 "block_size": 512, 00:11:49.896 "num_blocks": 65536, 00:11:49.896 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:49.896 "assigned_rate_limits": { 00:11:49.896 "rw_ios_per_sec": 0, 00:11:49.896 "rw_mbytes_per_sec": 0, 00:11:49.896 "r_mbytes_per_sec": 0, 00:11:49.896 "w_mbytes_per_sec": 0 00:11:49.896 }, 00:11:49.896 "claimed": true, 
00:11:49.896 "claim_type": "exclusive_write", 00:11:49.896 "zoned": false, 00:11:49.896 "supported_io_types": { 00:11:49.896 "read": true, 00:11:49.896 "write": true, 00:11:49.896 "unmap": true, 00:11:49.896 "flush": true, 00:11:49.896 "reset": true, 00:11:49.896 "nvme_admin": false, 00:11:49.896 "nvme_io": false, 00:11:49.896 "nvme_io_md": false, 00:11:49.896 "write_zeroes": true, 00:11:49.896 "zcopy": true, 00:11:49.896 "get_zone_info": false, 00:11:49.896 "zone_management": false, 00:11:49.896 "zone_append": false, 00:11:49.896 "compare": false, 00:11:49.896 "compare_and_write": false, 00:11:49.896 "abort": true, 00:11:49.896 "seek_hole": false, 00:11:49.896 "seek_data": false, 00:11:49.896 "copy": true, 00:11:49.896 "nvme_iov_md": false 00:11:49.896 }, 00:11:49.896 "memory_domains": [ 00:11:49.896 { 00:11:49.896 "dma_device_id": "system", 00:11:49.896 "dma_device_type": 1 00:11:49.896 }, 00:11:49.896 { 00:11:49.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.896 "dma_device_type": 2 00:11:49.896 } 00:11:49.896 ], 00:11:49.896 "driver_specific": {} 00:11:49.896 } 00:11:49.896 ] 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.896 "name": "Existed_Raid", 00:11:49.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.896 "strip_size_kb": 0, 00:11:49.896 "state": "configuring", 00:11:49.896 "raid_level": "raid1", 00:11:49.896 "superblock": false, 00:11:49.896 "num_base_bdevs": 2, 00:11:49.896 "num_base_bdevs_discovered": 1, 00:11:49.896 "num_base_bdevs_operational": 2, 00:11:49.896 "base_bdevs_list": [ 00:11:49.896 { 00:11:49.896 "name": "BaseBdev1", 00:11:49.896 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:49.896 "is_configured": true, 00:11:49.896 "data_offset": 0, 00:11:49.896 "data_size": 65536 00:11:49.896 }, 00:11:49.896 { 00:11:49.896 "name": "BaseBdev2", 00:11:49.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.896 "is_configured": false, 00:11:49.896 "data_offset": 0, 00:11:49.896 "data_size": 0 00:11:49.896 } 00:11:49.896 ] 00:11:49.896 }' 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.896 13:30:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.462 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:50.720 [2024-07-15 13:30:29.972075] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:50.720 [2024-07-15 13:30:29.972116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacf350 name Existed_Raid, state configuring 00:11:50.720 13:30:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:50.978 [2024-07-15 13:30:30.148572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:50.978 [2024-07-15 13:30:30.150077] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:50.978 [2024-07-15 13:30:30.150111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.978 13:30:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.978 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:51.237 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:51.237 "name": "Existed_Raid", 00:11:51.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:51.237 "strip_size_kb": 0, 00:11:51.237 "state": "configuring", 00:11:51.237 "raid_level": "raid1", 00:11:51.237 "superblock": false, 00:11:51.237 "num_base_bdevs": 2, 00:11:51.237 "num_base_bdevs_discovered": 1, 00:11:51.237 "num_base_bdevs_operational": 2, 00:11:51.237 "base_bdevs_list": [ 00:11:51.237 { 00:11:51.237 "name": "BaseBdev1", 00:11:51.237 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:51.237 "is_configured": true, 00:11:51.237 "data_offset": 0, 00:11:51.237 "data_size": 65536 00:11:51.237 }, 00:11:51.237 { 00:11:51.237 "name": "BaseBdev2", 00:11:51.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:51.237 "is_configured": false, 00:11:51.237 "data_offset": 0, 00:11:51.237 "data_size": 0 00:11:51.237 } 00:11:51.237 ] 00:11:51.237 }' 00:11:51.237 13:30:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.237 13:30:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.804 13:30:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:52.062 [2024-07-15 13:30:31.246812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:52.062 [2024-07-15 13:30:31.246857] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xad0000 00:11:52.062 [2024-07-15 13:30:31.246866] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:52.062 [2024-07-15 13:30:31.247065] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ea0c0 00:11:52.062 [2024-07-15 13:30:31.247184] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad0000 00:11:52.062 [2024-07-15 13:30:31.247194] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xad0000 00:11:52.062 [2024-07-15 13:30:31.247360] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:52.062 BaseBdev2 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:52.062 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:52.321 13:30:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:52.321 [ 00:11:52.321 { 00:11:52.321 "name": "BaseBdev2", 00:11:52.321 "aliases": [ 00:11:52.321 "d95c320e-8bfa-4a99-9a15-48f35aefd6e7" 00:11:52.321 ], 00:11:52.321 "product_name": "Malloc disk", 00:11:52.321 "block_size": 512, 00:11:52.321 "num_blocks": 65536, 00:11:52.321 "uuid": "d95c320e-8bfa-4a99-9a15-48f35aefd6e7", 00:11:52.321 "assigned_rate_limits": { 00:11:52.321 "rw_ios_per_sec": 0, 00:11:52.321 "rw_mbytes_per_sec": 0, 00:11:52.321 "r_mbytes_per_sec": 0, 00:11:52.321 "w_mbytes_per_sec": 0 00:11:52.321 }, 00:11:52.321 "claimed": true, 00:11:52.321 "claim_type": "exclusive_write", 00:11:52.321 "zoned": false, 00:11:52.321 "supported_io_types": { 00:11:52.321 "read": true, 00:11:52.321 "write": true, 00:11:52.321 "unmap": true, 00:11:52.321 "flush": true, 00:11:52.321 "reset": true, 00:11:52.321 "nvme_admin": false, 00:11:52.321 "nvme_io": false, 00:11:52.321 "nvme_io_md": false, 00:11:52.321 "write_zeroes": true, 00:11:52.321 "zcopy": true, 00:11:52.321 "get_zone_info": false, 00:11:52.321 "zone_management": false, 00:11:52.321 "zone_append": false, 00:11:52.321 "compare": false, 00:11:52.321 "compare_and_write": false, 00:11:52.321 "abort": true, 00:11:52.321 "seek_hole": false, 00:11:52.321 "seek_data": false, 00:11:52.321 "copy": true, 00:11:52.321 "nvme_iov_md": false 00:11:52.321 }, 00:11:52.321 "memory_domains": [ 00:11:52.321 { 00:11:52.321 "dma_device_id": "system", 00:11:52.321 "dma_device_type": 1 00:11:52.321 }, 00:11:52.321 { 00:11:52.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.321 "dma_device_type": 2 00:11:52.321 } 00:11:52.321 ], 00:11:52.321 "driver_specific": {} 00:11:52.321 } 00:11:52.321 ] 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.580 "name": "Existed_Raid", 00:11:52.580 "uuid": "8ea61c61-e278-4d47-b130-2e0def64a76c", 00:11:52.580 "strip_size_kb": 0, 00:11:52.580 "state": "online", 00:11:52.580 "raid_level": "raid1", 00:11:52.580 "superblock": false, 00:11:52.580 "num_base_bdevs": 
2, 00:11:52.580 "num_base_bdevs_discovered": 2, 00:11:52.580 "num_base_bdevs_operational": 2, 00:11:52.580 "base_bdevs_list": [ 00:11:52.580 { 00:11:52.580 "name": "BaseBdev1", 00:11:52.580 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:52.580 "is_configured": true, 00:11:52.580 "data_offset": 0, 00:11:52.580 "data_size": 65536 00:11:52.580 }, 00:11:52.580 { 00:11:52.580 "name": "BaseBdev2", 00:11:52.580 "uuid": "d95c320e-8bfa-4a99-9a15-48f35aefd6e7", 00:11:52.580 "is_configured": true, 00:11:52.580 "data_offset": 0, 00:11:52.580 "data_size": 65536 00:11:52.580 } 00:11:52.580 ] 00:11:52.580 }' 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.580 13:30:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:53.147 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:53.405 [2024-07-15 13:30:32.775160] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:53.405 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:11:53.405 "name": "Existed_Raid", 00:11:53.405 "aliases": [ 00:11:53.405 "8ea61c61-e278-4d47-b130-2e0def64a76c" 00:11:53.405 ], 00:11:53.405 "product_name": "Raid Volume", 00:11:53.405 "block_size": 512, 00:11:53.405 "num_blocks": 65536, 00:11:53.405 "uuid": "8ea61c61-e278-4d47-b130-2e0def64a76c", 00:11:53.405 "assigned_rate_limits": { 00:11:53.405 "rw_ios_per_sec": 0, 00:11:53.405 "rw_mbytes_per_sec": 0, 00:11:53.405 "r_mbytes_per_sec": 0, 00:11:53.405 "w_mbytes_per_sec": 0 00:11:53.405 }, 00:11:53.405 "claimed": false, 00:11:53.405 "zoned": false, 00:11:53.405 "supported_io_types": { 00:11:53.405 "read": true, 00:11:53.405 "write": true, 00:11:53.405 "unmap": false, 00:11:53.405 "flush": false, 00:11:53.405 "reset": true, 00:11:53.405 "nvme_admin": false, 00:11:53.405 "nvme_io": false, 00:11:53.405 "nvme_io_md": false, 00:11:53.405 "write_zeroes": true, 00:11:53.405 "zcopy": false, 00:11:53.405 "get_zone_info": false, 00:11:53.405 "zone_management": false, 00:11:53.405 "zone_append": false, 00:11:53.405 "compare": false, 00:11:53.405 "compare_and_write": false, 00:11:53.405 "abort": false, 00:11:53.405 "seek_hole": false, 00:11:53.405 "seek_data": false, 00:11:53.405 "copy": false, 00:11:53.405 "nvme_iov_md": false 00:11:53.405 }, 00:11:53.405 "memory_domains": [ 00:11:53.405 { 00:11:53.405 "dma_device_id": "system", 00:11:53.405 "dma_device_type": 1 00:11:53.405 }, 00:11:53.405 { 00:11:53.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.405 "dma_device_type": 2 00:11:53.405 }, 00:11:53.405 { 00:11:53.405 "dma_device_id": "system", 00:11:53.405 "dma_device_type": 1 00:11:53.405 }, 00:11:53.405 { 00:11:53.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.405 "dma_device_type": 2 00:11:53.405 } 00:11:53.405 ], 00:11:53.405 "driver_specific": { 00:11:53.405 "raid": { 00:11:53.405 "uuid": "8ea61c61-e278-4d47-b130-2e0def64a76c", 00:11:53.405 "strip_size_kb": 0, 00:11:53.405 "state": "online", 00:11:53.405 "raid_level": "raid1", 
00:11:53.405 "superblock": false, 00:11:53.405 "num_base_bdevs": 2, 00:11:53.405 "num_base_bdevs_discovered": 2, 00:11:53.405 "num_base_bdevs_operational": 2, 00:11:53.405 "base_bdevs_list": [ 00:11:53.405 { 00:11:53.405 "name": "BaseBdev1", 00:11:53.405 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:53.405 "is_configured": true, 00:11:53.405 "data_offset": 0, 00:11:53.405 "data_size": 65536 00:11:53.405 }, 00:11:53.405 { 00:11:53.405 "name": "BaseBdev2", 00:11:53.405 "uuid": "d95c320e-8bfa-4a99-9a15-48f35aefd6e7", 00:11:53.405 "is_configured": true, 00:11:53.405 "data_offset": 0, 00:11:53.405 "data_size": 65536 00:11:53.405 } 00:11:53.405 ] 00:11:53.405 } 00:11:53.405 } 00:11:53.405 }' 00:11:53.405 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:53.663 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:53.663 BaseBdev2' 00:11:53.663 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:53.663 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:53.663 13:30:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:53.926 "name": "BaseBdev1", 00:11:53.926 "aliases": [ 00:11:53.926 "dff3178f-5f68-4ed2-81fd-a427b609f358" 00:11:53.926 ], 00:11:53.926 "product_name": "Malloc disk", 00:11:53.926 "block_size": 512, 00:11:53.926 "num_blocks": 65536, 00:11:53.926 "uuid": "dff3178f-5f68-4ed2-81fd-a427b609f358", 00:11:53.926 "assigned_rate_limits": { 00:11:53.926 "rw_ios_per_sec": 0, 00:11:53.926 "rw_mbytes_per_sec": 0, 00:11:53.926 "r_mbytes_per_sec": 0, 00:11:53.926 
"w_mbytes_per_sec": 0 00:11:53.926 }, 00:11:53.926 "claimed": true, 00:11:53.926 "claim_type": "exclusive_write", 00:11:53.926 "zoned": false, 00:11:53.926 "supported_io_types": { 00:11:53.926 "read": true, 00:11:53.926 "write": true, 00:11:53.926 "unmap": true, 00:11:53.926 "flush": true, 00:11:53.926 "reset": true, 00:11:53.926 "nvme_admin": false, 00:11:53.926 "nvme_io": false, 00:11:53.926 "nvme_io_md": false, 00:11:53.926 "write_zeroes": true, 00:11:53.926 "zcopy": true, 00:11:53.926 "get_zone_info": false, 00:11:53.926 "zone_management": false, 00:11:53.926 "zone_append": false, 00:11:53.926 "compare": false, 00:11:53.926 "compare_and_write": false, 00:11:53.926 "abort": true, 00:11:53.926 "seek_hole": false, 00:11:53.926 "seek_data": false, 00:11:53.926 "copy": true, 00:11:53.926 "nvme_iov_md": false 00:11:53.926 }, 00:11:53.926 "memory_domains": [ 00:11:53.926 { 00:11:53.926 "dma_device_id": "system", 00:11:53.926 "dma_device_type": 1 00:11:53.926 }, 00:11:53.926 { 00:11:53.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.926 "dma_device_type": 2 00:11:53.926 } 00:11:53.926 ], 00:11:53.926 "driver_specific": {} 00:11:53.926 }' 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:53.926 
13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:53.926 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.191 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.191 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.191 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:54.192 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:54.192 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:54.449 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:54.449 "name": "BaseBdev2", 00:11:54.449 "aliases": [ 00:11:54.449 "d95c320e-8bfa-4a99-9a15-48f35aefd6e7" 00:11:54.449 ], 00:11:54.449 "product_name": "Malloc disk", 00:11:54.449 "block_size": 512, 00:11:54.449 "num_blocks": 65536, 00:11:54.449 "uuid": "d95c320e-8bfa-4a99-9a15-48f35aefd6e7", 00:11:54.449 "assigned_rate_limits": { 00:11:54.449 "rw_ios_per_sec": 0, 00:11:54.449 "rw_mbytes_per_sec": 0, 00:11:54.449 "r_mbytes_per_sec": 0, 00:11:54.449 "w_mbytes_per_sec": 0 00:11:54.449 }, 00:11:54.449 "claimed": true, 00:11:54.449 "claim_type": "exclusive_write", 00:11:54.449 "zoned": false, 00:11:54.449 "supported_io_types": { 00:11:54.449 "read": true, 00:11:54.449 "write": true, 00:11:54.449 "unmap": true, 00:11:54.449 "flush": true, 00:11:54.449 "reset": true, 00:11:54.449 "nvme_admin": false, 00:11:54.449 "nvme_io": false, 00:11:54.449 "nvme_io_md": false, 00:11:54.449 "write_zeroes": true, 00:11:54.449 "zcopy": true, 00:11:54.449 "get_zone_info": false, 00:11:54.449 "zone_management": false, 00:11:54.449 "zone_append": false, 00:11:54.449 "compare": 
false, 00:11:54.449 "compare_and_write": false, 00:11:54.449 "abort": true, 00:11:54.449 "seek_hole": false, 00:11:54.449 "seek_data": false, 00:11:54.449 "copy": true, 00:11:54.449 "nvme_iov_md": false 00:11:54.449 }, 00:11:54.449 "memory_domains": [ 00:11:54.449 { 00:11:54.449 "dma_device_id": "system", 00:11:54.449 "dma_device_type": 1 00:11:54.449 }, 00:11:54.449 { 00:11:54.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.449 "dma_device_type": 2 00:11:54.450 } 00:11:54.450 ], 00:11:54.450 "driver_specific": {} 00:11:54.450 }' 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:54.450 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.708 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.708 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:54.708 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.708 13:30:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.708 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.708 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:54.966 
[2024-07-15 13:30:34.262875] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.966 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.966 13:30:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.225 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.225 "name": "Existed_Raid", 00:11:55.225 "uuid": "8ea61c61-e278-4d47-b130-2e0def64a76c", 00:11:55.225 "strip_size_kb": 0, 00:11:55.225 "state": "online", 00:11:55.225 "raid_level": "raid1", 00:11:55.225 "superblock": false, 00:11:55.225 "num_base_bdevs": 2, 00:11:55.225 "num_base_bdevs_discovered": 1, 00:11:55.225 "num_base_bdevs_operational": 1, 00:11:55.225 "base_bdevs_list": [ 00:11:55.225 { 00:11:55.225 "name": null, 00:11:55.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.225 "is_configured": false, 00:11:55.225 "data_offset": 0, 00:11:55.225 "data_size": 65536 00:11:55.225 }, 00:11:55.225 { 00:11:55.225 "name": "BaseBdev2", 00:11:55.225 "uuid": "d95c320e-8bfa-4a99-9a15-48f35aefd6e7", 00:11:55.225 "is_configured": true, 00:11:55.225 "data_offset": 0, 00:11:55.225 "data_size": 65536 00:11:55.225 } 00:11:55.225 ] 00:11:55.225 }' 00:11:55.225 13:30:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.225 13:30:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.793 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:55.793 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:55.793 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.794 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:56.052 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:56.052 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:11:56.053 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:56.317 [2024-07-15 13:30:35.555316] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:56.317 [2024-07-15 13:30:35.555406] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:56.317 [2024-07-15 13:30:35.568063] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:56.317 [2024-07-15 13:30:35.568101] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:56.317 [2024-07-15 13:30:35.568114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad0000 name Existed_Raid, state offline 00:11:56.317 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:56.317 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:56.317 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.317 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2079951 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2079951 ']' 00:11:56.577 13:30:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2079951 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2079951 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2079951' 00:11:56.577 killing process with pid 2079951 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2079951 00:11:56.577 [2024-07-15 13:30:35.864994] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.577 13:30:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2079951 00:11:56.577 [2024-07-15 13:30:35.865847] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:56.837 00:11:56.837 real 0m9.770s 00:11:56.837 user 0m17.382s 00:11:56.837 sys 0m1.818s 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.837 ************************************ 00:11:56.837 END TEST raid_state_function_test 00:11:56.837 ************************************ 00:11:56.837 13:30:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:56.837 13:30:36 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:56.837 13:30:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:56.837 13:30:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:56.837 13:30:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:56.837 ************************************ 00:11:56.837 START TEST raid_state_function_test_sb 00:11:56.837 ************************************ 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2081426 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2081426' 00:11:56.837 Process raid pid: 2081426 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2081426 /var/tmp/spdk-raid.sock 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2081426 ']' 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:11:56.837 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.837 13:30:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:56.837 [2024-07-15 13:30:36.207430] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:11:56.837 [2024-07-15 13:30:36.207499] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:57.096 [2024-07-15 13:30:36.327459] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:57.096 [2024-07-15 13:30:36.434033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.096 [2024-07-15 13:30:36.501973] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.096 [2024-07-15 13:30:36.502011] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.032 13:30:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.032 13:30:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:58.032 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:58.291 [2024-07-15 13:30:37.617994] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev1 00:11:58.291 [2024-07-15 13:30:37.618038] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:58.291 [2024-07-15 13:30:37.618049] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:58.291 [2024-07-15 13:30:37.618061] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.291 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.550 13:30:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.550 "name": "Existed_Raid", 00:11:58.550 "uuid": "17f66070-bfcb-41ef-aab1-8ca31cae9aaf", 00:11:58.550 "strip_size_kb": 0, 00:11:58.550 "state": "configuring", 00:11:58.550 "raid_level": "raid1", 00:11:58.550 "superblock": true, 00:11:58.550 "num_base_bdevs": 2, 00:11:58.550 "num_base_bdevs_discovered": 0, 00:11:58.550 "num_base_bdevs_operational": 2, 00:11:58.550 "base_bdevs_list": [ 00:11:58.550 { 00:11:58.550 "name": "BaseBdev1", 00:11:58.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.550 "is_configured": false, 00:11:58.550 "data_offset": 0, 00:11:58.550 "data_size": 0 00:11:58.550 }, 00:11:58.550 { 00:11:58.550 "name": "BaseBdev2", 00:11:58.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.550 "is_configured": false, 00:11:58.550 "data_offset": 0, 00:11:58.550 "data_size": 0 00:11:58.550 } 00:11:58.550 ] 00:11:58.550 }' 00:11:58.550 13:30:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.550 13:30:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:59.118 13:30:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:59.376 [2024-07-15 13:30:38.636544] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:59.376 [2024-07-15 13:30:38.636572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1106a80 name Existed_Raid, state configuring 00:11:59.376 13:30:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:59.636 [2024-07-15 13:30:38.813033] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:59.636 
[2024-07-15 13:30:38.813060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:59.636 [2024-07-15 13:30:38.813069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:59.636 [2024-07-15 13:30:38.813081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:59.636 13:30:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:59.636 [2024-07-15 13:30:39.007445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:59.636 BaseBdev1 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:59.636 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:59.895 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:00.154 [ 00:12:00.154 { 00:12:00.154 "name": "BaseBdev1", 00:12:00.154 "aliases": [ 00:12:00.154 
"595461b7-88bf-4a25-8695-75a953908ec3" 00:12:00.154 ], 00:12:00.154 "product_name": "Malloc disk", 00:12:00.154 "block_size": 512, 00:12:00.154 "num_blocks": 65536, 00:12:00.154 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:00.154 "assigned_rate_limits": { 00:12:00.154 "rw_ios_per_sec": 0, 00:12:00.154 "rw_mbytes_per_sec": 0, 00:12:00.154 "r_mbytes_per_sec": 0, 00:12:00.154 "w_mbytes_per_sec": 0 00:12:00.154 }, 00:12:00.154 "claimed": true, 00:12:00.154 "claim_type": "exclusive_write", 00:12:00.154 "zoned": false, 00:12:00.154 "supported_io_types": { 00:12:00.154 "read": true, 00:12:00.154 "write": true, 00:12:00.154 "unmap": true, 00:12:00.154 "flush": true, 00:12:00.154 "reset": true, 00:12:00.154 "nvme_admin": false, 00:12:00.154 "nvme_io": false, 00:12:00.154 "nvme_io_md": false, 00:12:00.154 "write_zeroes": true, 00:12:00.154 "zcopy": true, 00:12:00.154 "get_zone_info": false, 00:12:00.154 "zone_management": false, 00:12:00.154 "zone_append": false, 00:12:00.154 "compare": false, 00:12:00.154 "compare_and_write": false, 00:12:00.154 "abort": true, 00:12:00.154 "seek_hole": false, 00:12:00.154 "seek_data": false, 00:12:00.154 "copy": true, 00:12:00.154 "nvme_iov_md": false 00:12:00.154 }, 00:12:00.154 "memory_domains": [ 00:12:00.154 { 00:12:00.154 "dma_device_id": "system", 00:12:00.154 "dma_device_type": 1 00:12:00.154 }, 00:12:00.154 { 00:12:00.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.154 "dma_device_type": 2 00:12:00.154 } 00:12:00.154 ], 00:12:00.154 "driver_specific": {} 00:12:00.154 } 00:12:00.154 ] 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.154 "name": "Existed_Raid", 00:12:00.154 "uuid": "c7e31b78-ff6d-4f62-9f20-10e8930cbc97", 00:12:00.154 "strip_size_kb": 0, 00:12:00.154 "state": "configuring", 00:12:00.154 "raid_level": "raid1", 00:12:00.154 "superblock": true, 00:12:00.154 "num_base_bdevs": 2, 00:12:00.154 "num_base_bdevs_discovered": 1, 00:12:00.154 "num_base_bdevs_operational": 2, 00:12:00.154 "base_bdevs_list": [ 00:12:00.154 { 00:12:00.154 "name": "BaseBdev1", 00:12:00.154 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:00.154 "is_configured": true, 00:12:00.154 "data_offset": 2048, 00:12:00.154 "data_size": 63488 00:12:00.154 }, 00:12:00.154 { 00:12:00.154 "name": "BaseBdev2", 00:12:00.154 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:00.154 "is_configured": false, 00:12:00.154 "data_offset": 0, 00:12:00.154 "data_size": 0 00:12:00.154 } 00:12:00.154 ] 00:12:00.154 }' 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.154 13:30:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:00.722 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:00.980 [2024-07-15 13:30:40.290866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:00.980 [2024-07-15 13:30:40.290908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1106350 name Existed_Raid, state configuring 00:12:00.980 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:01.239 [2024-07-15 13:30:40.535545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:01.239 [2024-07-15 13:30:40.537099] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:01.239 [2024-07-15 13:30:40.537134] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.239 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:01.498 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.498 "name": "Existed_Raid", 00:12:01.498 "uuid": "a2b176db-bb2a-4bc2-aac4-ade614424e53", 00:12:01.498 "strip_size_kb": 0, 00:12:01.498 "state": "configuring", 00:12:01.498 "raid_level": "raid1", 00:12:01.498 "superblock": true, 00:12:01.498 "num_base_bdevs": 2, 00:12:01.498 "num_base_bdevs_discovered": 1, 00:12:01.498 "num_base_bdevs_operational": 2, 00:12:01.498 "base_bdevs_list": [ 00:12:01.498 { 00:12:01.498 "name": "BaseBdev1", 00:12:01.498 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:01.498 "is_configured": true, 00:12:01.498 "data_offset": 2048, 00:12:01.498 "data_size": 63488 00:12:01.498 }, 00:12:01.498 
{ 00:12:01.498 "name": "BaseBdev2", 00:12:01.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.498 "is_configured": false, 00:12:01.498 "data_offset": 0, 00:12:01.498 "data_size": 0 00:12:01.498 } 00:12:01.498 ] 00:12:01.498 }' 00:12:01.498 13:30:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.498 13:30:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:02.064 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:02.323 [2024-07-15 13:30:41.541536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:02.323 [2024-07-15 13:30:41.541684] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1107000 00:12:02.323 [2024-07-15 13:30:41.541698] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:02.323 [2024-07-15 13:30:41.541874] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10210c0 00:12:02.323 [2024-07-15 13:30:41.542006] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1107000 00:12:02.323 [2024-07-15 13:30:41.542017] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1107000 00:12:02.323 [2024-07-15 13:30:41.542110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.323 BaseBdev2 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:02.323 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:02.582 [ 00:12:02.582 { 00:12:02.582 "name": "BaseBdev2", 00:12:02.582 "aliases": [ 00:12:02.582 "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a" 00:12:02.582 ], 00:12:02.582 "product_name": "Malloc disk", 00:12:02.582 "block_size": 512, 00:12:02.582 "num_blocks": 65536, 00:12:02.582 "uuid": "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a", 00:12:02.582 "assigned_rate_limits": { 00:12:02.582 "rw_ios_per_sec": 0, 00:12:02.582 "rw_mbytes_per_sec": 0, 00:12:02.582 "r_mbytes_per_sec": 0, 00:12:02.582 "w_mbytes_per_sec": 0 00:12:02.582 }, 00:12:02.582 "claimed": true, 00:12:02.582 "claim_type": "exclusive_write", 00:12:02.582 "zoned": false, 00:12:02.582 "supported_io_types": { 00:12:02.582 "read": true, 00:12:02.582 "write": true, 00:12:02.582 "unmap": true, 00:12:02.582 "flush": true, 00:12:02.582 "reset": true, 00:12:02.582 "nvme_admin": false, 00:12:02.582 "nvme_io": false, 00:12:02.582 "nvme_io_md": false, 00:12:02.582 "write_zeroes": true, 00:12:02.582 "zcopy": true, 00:12:02.582 "get_zone_info": false, 00:12:02.582 "zone_management": false, 00:12:02.582 "zone_append": false, 00:12:02.582 "compare": false, 00:12:02.582 "compare_and_write": false, 00:12:02.582 "abort": true, 00:12:02.582 "seek_hole": false, 00:12:02.582 "seek_data": false, 00:12:02.582 "copy": true, 00:12:02.582 
"nvme_iov_md": false 00:12:02.582 }, 00:12:02.582 "memory_domains": [ 00:12:02.582 { 00:12:02.582 "dma_device_id": "system", 00:12:02.582 "dma_device_type": 1 00:12:02.582 }, 00:12:02.582 { 00:12:02.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.582 "dma_device_type": 2 00:12:02.582 } 00:12:02.582 ], 00:12:02.582 "driver_specific": {} 00:12:02.582 } 00:12:02.582 ] 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:12:02.582 13:30:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.840 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.840 "name": "Existed_Raid", 00:12:02.840 "uuid": "a2b176db-bb2a-4bc2-aac4-ade614424e53", 00:12:02.840 "strip_size_kb": 0, 00:12:02.840 "state": "online", 00:12:02.840 "raid_level": "raid1", 00:12:02.840 "superblock": true, 00:12:02.840 "num_base_bdevs": 2, 00:12:02.840 "num_base_bdevs_discovered": 2, 00:12:02.840 "num_base_bdevs_operational": 2, 00:12:02.840 "base_bdevs_list": [ 00:12:02.840 { 00:12:02.840 "name": "BaseBdev1", 00:12:02.840 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:02.840 "is_configured": true, 00:12:02.840 "data_offset": 2048, 00:12:02.840 "data_size": 63488 00:12:02.840 }, 00:12:02.840 { 00:12:02.840 "name": "BaseBdev2", 00:12:02.840 "uuid": "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a", 00:12:02.840 "is_configured": true, 00:12:02.840 "data_offset": 2048, 00:12:02.840 "data_size": 63488 00:12:02.840 } 00:12:02.840 ] 00:12:02.840 }' 00:12:02.840 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.840 13:30:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:03.407 13:30:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:03.684 [2024-07-15 13:30:42.997779] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:03.684 "name": "Existed_Raid", 00:12:03.684 "aliases": [ 00:12:03.684 "a2b176db-bb2a-4bc2-aac4-ade614424e53" 00:12:03.684 ], 00:12:03.684 "product_name": "Raid Volume", 00:12:03.684 "block_size": 512, 00:12:03.684 "num_blocks": 63488, 00:12:03.684 "uuid": "a2b176db-bb2a-4bc2-aac4-ade614424e53", 00:12:03.684 "assigned_rate_limits": { 00:12:03.684 "rw_ios_per_sec": 0, 00:12:03.684 "rw_mbytes_per_sec": 0, 00:12:03.684 "r_mbytes_per_sec": 0, 00:12:03.684 "w_mbytes_per_sec": 0 00:12:03.684 }, 00:12:03.684 "claimed": false, 00:12:03.684 "zoned": false, 00:12:03.684 "supported_io_types": { 00:12:03.684 "read": true, 00:12:03.684 "write": true, 00:12:03.684 "unmap": false, 00:12:03.684 "flush": false, 00:12:03.684 "reset": true, 00:12:03.684 "nvme_admin": false, 00:12:03.684 "nvme_io": false, 00:12:03.684 "nvme_io_md": false, 00:12:03.684 "write_zeroes": true, 00:12:03.684 "zcopy": false, 00:12:03.684 "get_zone_info": false, 00:12:03.684 "zone_management": false, 00:12:03.684 "zone_append": false, 00:12:03.684 "compare": false, 00:12:03.684 "compare_and_write": false, 00:12:03.684 "abort": false, 00:12:03.684 "seek_hole": false, 00:12:03.684 "seek_data": false, 00:12:03.684 "copy": false, 00:12:03.684 "nvme_iov_md": false 00:12:03.684 }, 00:12:03.684 "memory_domains": [ 00:12:03.684 { 00:12:03.684 "dma_device_id": "system", 
00:12:03.684 "dma_device_type": 1 00:12:03.684 }, 00:12:03.684 { 00:12:03.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.684 "dma_device_type": 2 00:12:03.684 }, 00:12:03.684 { 00:12:03.684 "dma_device_id": "system", 00:12:03.684 "dma_device_type": 1 00:12:03.684 }, 00:12:03.684 { 00:12:03.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.684 "dma_device_type": 2 00:12:03.684 } 00:12:03.684 ], 00:12:03.684 "driver_specific": { 00:12:03.684 "raid": { 00:12:03.684 "uuid": "a2b176db-bb2a-4bc2-aac4-ade614424e53", 00:12:03.684 "strip_size_kb": 0, 00:12:03.684 "state": "online", 00:12:03.684 "raid_level": "raid1", 00:12:03.684 "superblock": true, 00:12:03.684 "num_base_bdevs": 2, 00:12:03.684 "num_base_bdevs_discovered": 2, 00:12:03.684 "num_base_bdevs_operational": 2, 00:12:03.684 "base_bdevs_list": [ 00:12:03.684 { 00:12:03.684 "name": "BaseBdev1", 00:12:03.684 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:03.684 "is_configured": true, 00:12:03.684 "data_offset": 2048, 00:12:03.684 "data_size": 63488 00:12:03.684 }, 00:12:03.684 { 00:12:03.684 "name": "BaseBdev2", 00:12:03.684 "uuid": "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a", 00:12:03.684 "is_configured": true, 00:12:03.684 "data_offset": 2048, 00:12:03.684 "data_size": 63488 00:12:03.684 } 00:12:03.684 ] 00:12:03.684 } 00:12:03.684 } 00:12:03.684 }' 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:03.684 BaseBdev2' 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.684 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:03.951 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.951 "name": "BaseBdev1", 00:12:03.951 "aliases": [ 00:12:03.951 "595461b7-88bf-4a25-8695-75a953908ec3" 00:12:03.951 ], 00:12:03.951 "product_name": "Malloc disk", 00:12:03.951 "block_size": 512, 00:12:03.951 "num_blocks": 65536, 00:12:03.951 "uuid": "595461b7-88bf-4a25-8695-75a953908ec3", 00:12:03.951 "assigned_rate_limits": { 00:12:03.951 "rw_ios_per_sec": 0, 00:12:03.951 "rw_mbytes_per_sec": 0, 00:12:03.951 "r_mbytes_per_sec": 0, 00:12:03.951 "w_mbytes_per_sec": 0 00:12:03.951 }, 00:12:03.951 "claimed": true, 00:12:03.951 "claim_type": "exclusive_write", 00:12:03.951 "zoned": false, 00:12:03.951 "supported_io_types": { 00:12:03.951 "read": true, 00:12:03.951 "write": true, 00:12:03.951 "unmap": true, 00:12:03.951 "flush": true, 00:12:03.951 "reset": true, 00:12:03.951 "nvme_admin": false, 00:12:03.951 "nvme_io": false, 00:12:03.951 "nvme_io_md": false, 00:12:03.951 "write_zeroes": true, 00:12:03.951 "zcopy": true, 00:12:03.951 "get_zone_info": false, 00:12:03.951 "zone_management": false, 00:12:03.951 "zone_append": false, 00:12:03.951 "compare": false, 00:12:03.951 "compare_and_write": false, 00:12:03.951 "abort": true, 00:12:03.951 "seek_hole": false, 00:12:03.951 "seek_data": false, 00:12:03.951 "copy": true, 00:12:03.951 "nvme_iov_md": false 00:12:03.951 }, 00:12:03.951 "memory_domains": [ 00:12:03.951 { 00:12:03.951 "dma_device_id": "system", 00:12:03.951 "dma_device_type": 1 00:12:03.951 }, 00:12:03.951 { 00:12:03.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.951 "dma_device_type": 2 00:12:03.951 } 00:12:03.951 ], 00:12:03.951 "driver_specific": {} 00:12:03.951 }' 00:12:03.951 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.952 13:30:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.209 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.466 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.466 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.466 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.466 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:04.466 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.725 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.725 "name": "BaseBdev2", 00:12:04.725 "aliases": [ 00:12:04.725 "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a" 00:12:04.725 ], 00:12:04.725 "product_name": "Malloc disk", 00:12:04.725 "block_size": 512, 00:12:04.725 "num_blocks": 65536, 00:12:04.725 "uuid": "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a", 00:12:04.725 
"assigned_rate_limits": { 00:12:04.725 "rw_ios_per_sec": 0, 00:12:04.725 "rw_mbytes_per_sec": 0, 00:12:04.725 "r_mbytes_per_sec": 0, 00:12:04.725 "w_mbytes_per_sec": 0 00:12:04.725 }, 00:12:04.725 "claimed": true, 00:12:04.725 "claim_type": "exclusive_write", 00:12:04.725 "zoned": false, 00:12:04.725 "supported_io_types": { 00:12:04.725 "read": true, 00:12:04.725 "write": true, 00:12:04.725 "unmap": true, 00:12:04.725 "flush": true, 00:12:04.725 "reset": true, 00:12:04.725 "nvme_admin": false, 00:12:04.725 "nvme_io": false, 00:12:04.725 "nvme_io_md": false, 00:12:04.725 "write_zeroes": true, 00:12:04.725 "zcopy": true, 00:12:04.725 "get_zone_info": false, 00:12:04.725 "zone_management": false, 00:12:04.725 "zone_append": false, 00:12:04.725 "compare": false, 00:12:04.725 "compare_and_write": false, 00:12:04.725 "abort": true, 00:12:04.725 "seek_hole": false, 00:12:04.725 "seek_data": false, 00:12:04.725 "copy": true, 00:12:04.725 "nvme_iov_md": false 00:12:04.725 }, 00:12:04.725 "memory_domains": [ 00:12:04.725 { 00:12:04.725 "dma_device_id": "system", 00:12:04.725 "dma_device_type": 1 00:12:04.725 }, 00:12:04.725 { 00:12:04.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.725 "dma_device_type": 2 00:12:04.725 } 00:12:04.725 ], 00:12:04.725 "driver_specific": {} 00:12:04.725 }' 00:12:04.725 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.725 13:30:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.725 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.725 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.725 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.984 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:05.243 [2024-07-15 13:30:44.545670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.243 13:30:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.809 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.809 "name": "Existed_Raid", 00:12:05.809 "uuid": "a2b176db-bb2a-4bc2-aac4-ade614424e53", 00:12:05.809 "strip_size_kb": 0, 00:12:05.809 "state": "online", 00:12:05.809 "raid_level": "raid1", 00:12:05.809 "superblock": true, 00:12:05.809 "num_base_bdevs": 2, 00:12:05.809 "num_base_bdevs_discovered": 1, 00:12:05.809 "num_base_bdevs_operational": 1, 00:12:05.809 "base_bdevs_list": [ 00:12:05.809 { 00:12:05.809 "name": null, 00:12:05.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.809 "is_configured": false, 00:12:05.809 "data_offset": 2048, 00:12:05.809 "data_size": 63488 00:12:05.809 }, 00:12:05.809 { 00:12:05.809 "name": "BaseBdev2", 00:12:05.809 "uuid": "e14b5ca9-af2c-4a8d-986b-b17b246a2c9a", 00:12:05.809 "is_configured": true, 00:12:05.809 "data_offset": 2048, 00:12:05.809 "data_size": 63488 00:12:05.809 } 00:12:05.809 ] 00:12:05.809 }' 00:12:05.809 13:30:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.809 13:30:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:06.376 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:06.376 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:06.376 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.376 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:06.634 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:06.634 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:06.634 13:30:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:06.893 [2024-07-15 13:30:46.179473] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:06.893 [2024-07-15 13:30:46.179559] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:06.893 [2024-07-15 13:30:46.192262] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:06.893 [2024-07-15 13:30:46.192294] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:06.893 [2024-07-15 13:30:46.192307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1107000 name Existed_Raid, state offline 00:12:06.893 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:06.893 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:06.893 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.893 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2081426 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2081426 ']' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2081426 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2081426 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2081426' 00:12:07.152 killing process with pid 2081426 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2081426 00:12:07.152 [2024-07-15 13:30:46.506907] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:12:07.152 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2081426 00:12:07.152 [2024-07-15 13:30:46.507793] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:07.411 13:30:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:07.411 00:12:07.411 real 0m10.582s 00:12:07.411 user 0m18.825s 00:12:07.411 sys 0m1.933s 00:12:07.411 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:07.411 13:30:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.411 ************************************ 00:12:07.411 END TEST raid_state_function_test_sb 00:12:07.411 ************************************ 00:12:07.411 13:30:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.411 13:30:46 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:07.411 13:30:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:07.411 13:30:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.411 13:30:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.411 ************************************ 00:12:07.411 START TEST raid_superblock_test 00:12:07.411 ************************************ 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2083050 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2083050 /var/tmp/spdk-raid.sock 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2083050 ']' 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.411 13:30:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.670 [2024-07-15 13:30:46.881088] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:12:07.670 [2024-07-15 13:30:46.881162] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2083050 ] 00:12:07.670 [2024-07-15 13:30:47.012695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.929 [2024-07-15 13:30:47.115258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.929 [2024-07-15 13:30:47.179846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.929 [2024-07-15 13:30:47.179888] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:08.496 
13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:08.496 13:30:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:08.754 malloc1 00:12:08.754 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:09.014 [2024-07-15 13:30:48.289932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:09.014 [2024-07-15 13:30:48.289984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.014 [2024-07-15 13:30:48.290004] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f3570 00:12:09.014 [2024-07-15 13:30:48.290017] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.014 [2024-07-15 13:30:48.291612] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.014 [2024-07-15 13:30:48.291642] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:09.014 pt1 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:09.014 13:30:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:09.014 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:09.272 malloc2 00:12:09.272 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:09.531 [2024-07-15 13:30:48.780067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:09.531 [2024-07-15 13:30:48.780117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.531 [2024-07-15 13:30:48.780135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f4970 00:12:09.531 [2024-07-15 13:30:48.780148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.531 [2024-07-15 13:30:48.781674] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.531 [2024-07-15 13:30:48.781705] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:09.531 pt2 00:12:09.531 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:09.531 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:09.531 13:30:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:09.789 [2024-07-15 13:30:49.028733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:09.789 [2024-07-15 13:30:49.029907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:09.789 [2024-07-15 13:30:49.030055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb97270 00:12:09.789 [2024-07-15 13:30:49.030069] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:09.789 [2024-07-15 13:30:49.030258] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9eb0e0 00:12:09.789 [2024-07-15 13:30:49.030400] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb97270 00:12:09.789 [2024-07-15 13:30:49.030410] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb97270 00:12:09.789 [2024-07-15 13:30:49.030502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.789 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:10.047 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.047 "name": "raid_bdev1", 00:12:10.047 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:10.047 "strip_size_kb": 0, 00:12:10.047 "state": "online", 00:12:10.047 "raid_level": "raid1", 00:12:10.047 "superblock": true, 00:12:10.047 "num_base_bdevs": 2, 00:12:10.047 "num_base_bdevs_discovered": 2, 00:12:10.047 "num_base_bdevs_operational": 2, 00:12:10.047 "base_bdevs_list": [ 00:12:10.047 { 00:12:10.047 "name": "pt1", 00:12:10.047 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:10.047 "is_configured": true, 00:12:10.047 "data_offset": 2048, 00:12:10.047 "data_size": 63488 00:12:10.047 }, 00:12:10.047 { 00:12:10.047 "name": "pt2", 00:12:10.047 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:10.047 "is_configured": true, 00:12:10.047 "data_offset": 2048, 00:12:10.047 "data_size": 63488 00:12:10.047 } 00:12:10.047 ] 00:12:10.047 }' 00:12:10.047 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.047 13:30:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:10.612 13:30:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:10.612 13:30:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:10.871 [2024-07-15 13:30:50.115861] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.871 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:10.871 "name": "raid_bdev1", 00:12:10.871 "aliases": [ 00:12:10.871 "b02a4093-8523-45bb-8758-23f6f3f77971" 00:12:10.871 ], 00:12:10.871 "product_name": "Raid Volume", 00:12:10.871 "block_size": 512, 00:12:10.871 "num_blocks": 63488, 00:12:10.871 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:10.871 "assigned_rate_limits": { 00:12:10.871 "rw_ios_per_sec": 0, 00:12:10.871 "rw_mbytes_per_sec": 0, 00:12:10.871 "r_mbytes_per_sec": 0, 00:12:10.871 "w_mbytes_per_sec": 0 00:12:10.871 }, 00:12:10.871 "claimed": false, 00:12:10.871 "zoned": false, 00:12:10.871 "supported_io_types": { 00:12:10.871 "read": true, 00:12:10.871 "write": true, 00:12:10.871 "unmap": false, 00:12:10.871 "flush": false, 00:12:10.871 "reset": true, 00:12:10.871 "nvme_admin": false, 00:12:10.871 "nvme_io": false, 00:12:10.871 "nvme_io_md": false, 00:12:10.871 "write_zeroes": true, 00:12:10.871 "zcopy": false, 00:12:10.871 "get_zone_info": false, 00:12:10.871 "zone_management": false, 00:12:10.871 "zone_append": false, 00:12:10.871 "compare": false, 00:12:10.871 "compare_and_write": false, 00:12:10.871 
"abort": false, 00:12:10.871 "seek_hole": false, 00:12:10.871 "seek_data": false, 00:12:10.871 "copy": false, 00:12:10.871 "nvme_iov_md": false 00:12:10.871 }, 00:12:10.871 "memory_domains": [ 00:12:10.871 { 00:12:10.871 "dma_device_id": "system", 00:12:10.871 "dma_device_type": 1 00:12:10.871 }, 00:12:10.871 { 00:12:10.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.871 "dma_device_type": 2 00:12:10.871 }, 00:12:10.871 { 00:12:10.871 "dma_device_id": "system", 00:12:10.871 "dma_device_type": 1 00:12:10.871 }, 00:12:10.871 { 00:12:10.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.871 "dma_device_type": 2 00:12:10.871 } 00:12:10.871 ], 00:12:10.871 "driver_specific": { 00:12:10.871 "raid": { 00:12:10.871 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:10.871 "strip_size_kb": 0, 00:12:10.871 "state": "online", 00:12:10.871 "raid_level": "raid1", 00:12:10.871 "superblock": true, 00:12:10.871 "num_base_bdevs": 2, 00:12:10.871 "num_base_bdevs_discovered": 2, 00:12:10.871 "num_base_bdevs_operational": 2, 00:12:10.871 "base_bdevs_list": [ 00:12:10.871 { 00:12:10.871 "name": "pt1", 00:12:10.871 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:10.871 "is_configured": true, 00:12:10.871 "data_offset": 2048, 00:12:10.871 "data_size": 63488 00:12:10.871 }, 00:12:10.871 { 00:12:10.871 "name": "pt2", 00:12:10.871 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:10.871 "is_configured": true, 00:12:10.871 "data_offset": 2048, 00:12:10.871 "data_size": 63488 00:12:10.871 } 00:12:10.871 ] 00:12:10.871 } 00:12:10.871 } 00:12:10.871 }' 00:12:10.871 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.871 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:10.871 pt2' 00:12:10.871 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.871 13:30:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:10.871 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.129 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.129 "name": "pt1", 00:12:11.129 "aliases": [ 00:12:11.129 "00000000-0000-0000-0000-000000000001" 00:12:11.129 ], 00:12:11.129 "product_name": "passthru", 00:12:11.129 "block_size": 512, 00:12:11.129 "num_blocks": 65536, 00:12:11.129 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:11.129 "assigned_rate_limits": { 00:12:11.129 "rw_ios_per_sec": 0, 00:12:11.129 "rw_mbytes_per_sec": 0, 00:12:11.129 "r_mbytes_per_sec": 0, 00:12:11.129 "w_mbytes_per_sec": 0 00:12:11.129 }, 00:12:11.129 "claimed": true, 00:12:11.129 "claim_type": "exclusive_write", 00:12:11.129 "zoned": false, 00:12:11.129 "supported_io_types": { 00:12:11.129 "read": true, 00:12:11.129 "write": true, 00:12:11.129 "unmap": true, 00:12:11.129 "flush": true, 00:12:11.129 "reset": true, 00:12:11.129 "nvme_admin": false, 00:12:11.129 "nvme_io": false, 00:12:11.129 "nvme_io_md": false, 00:12:11.129 "write_zeroes": true, 00:12:11.129 "zcopy": true, 00:12:11.129 "get_zone_info": false, 00:12:11.129 "zone_management": false, 00:12:11.129 "zone_append": false, 00:12:11.129 "compare": false, 00:12:11.129 "compare_and_write": false, 00:12:11.129 "abort": true, 00:12:11.129 "seek_hole": false, 00:12:11.130 "seek_data": false, 00:12:11.130 "copy": true, 00:12:11.130 "nvme_iov_md": false 00:12:11.130 }, 00:12:11.130 "memory_domains": [ 00:12:11.130 { 00:12:11.130 "dma_device_id": "system", 00:12:11.130 "dma_device_type": 1 00:12:11.130 }, 00:12:11.130 { 00:12:11.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.130 "dma_device_type": 2 00:12:11.130 } 00:12:11.130 ], 00:12:11.130 "driver_specific": { 00:12:11.130 "passthru": { 00:12:11.130 
"name": "pt1", 00:12:11.130 "base_bdev_name": "malloc1" 00:12:11.130 } 00:12:11.130 } 00:12:11.130 }' 00:12:11.130 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.130 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.130 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.130 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.130 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.388 13:30:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:11.646 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.646 "name": "pt2", 00:12:11.646 "aliases": [ 00:12:11.646 "00000000-0000-0000-0000-000000000002" 00:12:11.646 ], 00:12:11.646 "product_name": "passthru", 00:12:11.646 "block_size": 512, 00:12:11.646 
"num_blocks": 65536, 00:12:11.646 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:11.646 "assigned_rate_limits": { 00:12:11.646 "rw_ios_per_sec": 0, 00:12:11.646 "rw_mbytes_per_sec": 0, 00:12:11.646 "r_mbytes_per_sec": 0, 00:12:11.646 "w_mbytes_per_sec": 0 00:12:11.646 }, 00:12:11.646 "claimed": true, 00:12:11.646 "claim_type": "exclusive_write", 00:12:11.646 "zoned": false, 00:12:11.646 "supported_io_types": { 00:12:11.646 "read": true, 00:12:11.646 "write": true, 00:12:11.646 "unmap": true, 00:12:11.646 "flush": true, 00:12:11.646 "reset": true, 00:12:11.646 "nvme_admin": false, 00:12:11.646 "nvme_io": false, 00:12:11.646 "nvme_io_md": false, 00:12:11.646 "write_zeroes": true, 00:12:11.646 "zcopy": true, 00:12:11.646 "get_zone_info": false, 00:12:11.646 "zone_management": false, 00:12:11.646 "zone_append": false, 00:12:11.646 "compare": false, 00:12:11.646 "compare_and_write": false, 00:12:11.646 "abort": true, 00:12:11.646 "seek_hole": false, 00:12:11.646 "seek_data": false, 00:12:11.646 "copy": true, 00:12:11.646 "nvme_iov_md": false 00:12:11.646 }, 00:12:11.646 "memory_domains": [ 00:12:11.646 { 00:12:11.646 "dma_device_id": "system", 00:12:11.646 "dma_device_type": 1 00:12:11.646 }, 00:12:11.646 { 00:12:11.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.646 "dma_device_type": 2 00:12:11.646 } 00:12:11.646 ], 00:12:11.646 "driver_specific": { 00:12:11.646 "passthru": { 00:12:11.646 "name": "pt2", 00:12:11.646 "base_bdev_name": "malloc2" 00:12:11.646 } 00:12:11.646 } 00:12:11.646 }' 00:12:11.646 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.905 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:12.164 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.164 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.164 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.164 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:12.164 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:12.422 [2024-07-15 13:30:51.647884] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.422 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b02a4093-8523-45bb-8758-23f6f3f77971 00:12:12.422 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b02a4093-8523-45bb-8758-23f6f3f77971 ']' 00:12:12.422 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:12.681 [2024-07-15 13:30:51.896315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:12.681 [2024-07-15 13:30:51.896338] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:12.681 [2024-07-15 13:30:51.896398] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.681 [2024-07-15 
13:30:51.896459] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.681 [2024-07-15 13:30:51.896472] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb97270 name raid_bdev1, state offline 00:12:12.681 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.681 13:30:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:12.681 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:12.681 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:12.681 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:12.681 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:12.939 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:12.939 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:13.197 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:13.197 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:13.775 13:30:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:13.775 [2024-07-15 13:30:53.191687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:13.775 [2024-07-15 13:30:53.193106] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:13.775 [2024-07-15 13:30:53.193164] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:13.775 [2024-07-15 13:30:53.193214] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:13.775 [2024-07-15 13:30:53.193232] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:13.775 [2024-07-15 13:30:53.193242] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb96ff0 name raid_bdev1, state configuring 00:12:13.775 request: 00:12:13.775 { 00:12:13.775 "name": "raid_bdev1", 00:12:13.775 "raid_level": "raid1", 00:12:13.775 "base_bdevs": [ 00:12:13.775 "malloc1", 00:12:13.775 "malloc2" 00:12:13.775 ], 00:12:13.775 "superblock": false, 00:12:13.775 "method": "bdev_raid_create", 00:12:13.775 "req_id": 1 00:12:13.775 } 00:12:13.775 Got JSON-RPC error response 00:12:13.775 response: 00:12:13.775 { 00:12:13.775 "code": -17, 00:12:13.775 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:13.775 } 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:14.033 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:14.292 [2024-07-15 13:30:53.664887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:14.292 [2024-07-15 13:30:53.664936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:14.292 [2024-07-15 13:30:53.664957] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f37a0 00:12:14.292 [2024-07-15 13:30:53.664970] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.292 [2024-07-15 13:30:53.666564] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.292 [2024-07-15 13:30:53.666596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:14.292 [2024-07-15 13:30:53.666661] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:14.292 [2024-07-15 13:30:53.666687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:14.292 pt1 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.292 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.550 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.550 "name": "raid_bdev1", 00:12:14.550 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:14.550 "strip_size_kb": 0, 00:12:14.550 "state": "configuring", 00:12:14.550 "raid_level": "raid1", 00:12:14.550 "superblock": true, 00:12:14.550 "num_base_bdevs": 2, 00:12:14.550 "num_base_bdevs_discovered": 1, 00:12:14.550 "num_base_bdevs_operational": 2, 00:12:14.550 "base_bdevs_list": [ 00:12:14.550 { 00:12:14.550 "name": "pt1", 00:12:14.550 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:14.550 "is_configured": true, 00:12:14.550 "data_offset": 2048, 00:12:14.550 "data_size": 63488 00:12:14.550 }, 00:12:14.550 { 00:12:14.550 "name": null, 00:12:14.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.550 "is_configured": false, 00:12:14.550 "data_offset": 2048, 00:12:14.550 "data_size": 63488 00:12:14.550 } 00:12:14.550 ] 00:12:14.550 }' 00:12:14.550 13:30:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.550 13:30:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.117 13:30:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:15.117 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:15.117 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:15.117 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:15.375 [2024-07-15 13:30:54.727701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:15.375 [2024-07-15 13:30:54.727753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:15.375 [2024-07-15 13:30:54.727771] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb8b6f0 00:12:15.375 [2024-07-15 13:30:54.727784] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:15.375 [2024-07-15 13:30:54.728149] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:15.375 [2024-07-15 13:30:54.728170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:15.375 [2024-07-15 13:30:54.728232] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:15.375 [2024-07-15 13:30:54.728251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:15.375 [2024-07-15 13:30:54.728356] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb8c590 00:12:15.375 [2024-07-15 13:30:54.728367] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:15.375 [2024-07-15 13:30:54.728536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ed540 00:12:15.375 [2024-07-15 13:30:54.728662] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb8c590 00:12:15.375 [2024-07-15 13:30:54.728672] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb8c590 00:12:15.375 [2024-07-15 13:30:54.728766] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.375 pt2 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.375 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:15.633 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.633 "name": 
"raid_bdev1", 00:12:15.633 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:15.633 "strip_size_kb": 0, 00:12:15.633 "state": "online", 00:12:15.633 "raid_level": "raid1", 00:12:15.633 "superblock": true, 00:12:15.633 "num_base_bdevs": 2, 00:12:15.633 "num_base_bdevs_discovered": 2, 00:12:15.633 "num_base_bdevs_operational": 2, 00:12:15.633 "base_bdevs_list": [ 00:12:15.633 { 00:12:15.633 "name": "pt1", 00:12:15.633 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:15.633 "is_configured": true, 00:12:15.633 "data_offset": 2048, 00:12:15.633 "data_size": 63488 00:12:15.633 }, 00:12:15.633 { 00:12:15.633 "name": "pt2", 00:12:15.633 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:15.633 "is_configured": true, 00:12:15.633 "data_offset": 2048, 00:12:15.633 "data_size": 63488 00:12:15.633 } 00:12:15.633 ] 00:12:15.633 }' 00:12:15.633 13:30:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.633 13:30:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:16.569 13:30:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:16.826 [2024-07-15 
13:30:56.095583] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:16.826 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:16.826 "name": "raid_bdev1", 00:12:16.826 "aliases": [ 00:12:16.826 "b02a4093-8523-45bb-8758-23f6f3f77971" 00:12:16.826 ], 00:12:16.826 "product_name": "Raid Volume", 00:12:16.826 "block_size": 512, 00:12:16.826 "num_blocks": 63488, 00:12:16.826 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:16.826 "assigned_rate_limits": { 00:12:16.826 "rw_ios_per_sec": 0, 00:12:16.826 "rw_mbytes_per_sec": 0, 00:12:16.826 "r_mbytes_per_sec": 0, 00:12:16.826 "w_mbytes_per_sec": 0 00:12:16.826 }, 00:12:16.826 "claimed": false, 00:12:16.826 "zoned": false, 00:12:16.826 "supported_io_types": { 00:12:16.826 "read": true, 00:12:16.826 "write": true, 00:12:16.826 "unmap": false, 00:12:16.826 "flush": false, 00:12:16.826 "reset": true, 00:12:16.826 "nvme_admin": false, 00:12:16.826 "nvme_io": false, 00:12:16.826 "nvme_io_md": false, 00:12:16.826 "write_zeroes": true, 00:12:16.826 "zcopy": false, 00:12:16.826 "get_zone_info": false, 00:12:16.826 "zone_management": false, 00:12:16.826 "zone_append": false, 00:12:16.826 "compare": false, 00:12:16.826 "compare_and_write": false, 00:12:16.826 "abort": false, 00:12:16.826 "seek_hole": false, 00:12:16.826 "seek_data": false, 00:12:16.826 "copy": false, 00:12:16.826 "nvme_iov_md": false 00:12:16.826 }, 00:12:16.826 "memory_domains": [ 00:12:16.826 { 00:12:16.826 "dma_device_id": "system", 00:12:16.826 "dma_device_type": 1 00:12:16.826 }, 00:12:16.826 { 00:12:16.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.826 "dma_device_type": 2 00:12:16.826 }, 00:12:16.826 { 00:12:16.826 "dma_device_id": "system", 00:12:16.826 "dma_device_type": 1 00:12:16.826 }, 00:12:16.826 { 00:12:16.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.826 "dma_device_type": 2 00:12:16.826 } 00:12:16.826 ], 00:12:16.826 "driver_specific": { 00:12:16.826 
"raid": { 00:12:16.826 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:16.826 "strip_size_kb": 0, 00:12:16.826 "state": "online", 00:12:16.826 "raid_level": "raid1", 00:12:16.826 "superblock": true, 00:12:16.826 "num_base_bdevs": 2, 00:12:16.826 "num_base_bdevs_discovered": 2, 00:12:16.826 "num_base_bdevs_operational": 2, 00:12:16.826 "base_bdevs_list": [ 00:12:16.826 { 00:12:16.826 "name": "pt1", 00:12:16.826 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:16.826 "is_configured": true, 00:12:16.826 "data_offset": 2048, 00:12:16.826 "data_size": 63488 00:12:16.826 }, 00:12:16.826 { 00:12:16.826 "name": "pt2", 00:12:16.826 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:16.826 "is_configured": true, 00:12:16.826 "data_offset": 2048, 00:12:16.827 "data_size": 63488 00:12:16.827 } 00:12:16.827 ] 00:12:16.827 } 00:12:16.827 } 00:12:16.827 }' 00:12:16.827 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:16.827 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:16.827 pt2' 00:12:16.827 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:16.827 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:16.827 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:17.085 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:17.085 "name": "pt1", 00:12:17.085 "aliases": [ 00:12:17.085 "00000000-0000-0000-0000-000000000001" 00:12:17.085 ], 00:12:17.085 "product_name": "passthru", 00:12:17.085 "block_size": 512, 00:12:17.085 "num_blocks": 65536, 00:12:17.085 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:17.085 "assigned_rate_limits": { 
00:12:17.085 "rw_ios_per_sec": 0, 00:12:17.085 "rw_mbytes_per_sec": 0, 00:12:17.085 "r_mbytes_per_sec": 0, 00:12:17.085 "w_mbytes_per_sec": 0 00:12:17.085 }, 00:12:17.085 "claimed": true, 00:12:17.085 "claim_type": "exclusive_write", 00:12:17.085 "zoned": false, 00:12:17.085 "supported_io_types": { 00:12:17.085 "read": true, 00:12:17.085 "write": true, 00:12:17.085 "unmap": true, 00:12:17.085 "flush": true, 00:12:17.085 "reset": true, 00:12:17.085 "nvme_admin": false, 00:12:17.085 "nvme_io": false, 00:12:17.085 "nvme_io_md": false, 00:12:17.085 "write_zeroes": true, 00:12:17.085 "zcopy": true, 00:12:17.085 "get_zone_info": false, 00:12:17.085 "zone_management": false, 00:12:17.085 "zone_append": false, 00:12:17.085 "compare": false, 00:12:17.085 "compare_and_write": false, 00:12:17.085 "abort": true, 00:12:17.085 "seek_hole": false, 00:12:17.085 "seek_data": false, 00:12:17.085 "copy": true, 00:12:17.085 "nvme_iov_md": false 00:12:17.085 }, 00:12:17.085 "memory_domains": [ 00:12:17.085 { 00:12:17.085 "dma_device_id": "system", 00:12:17.085 "dma_device_type": 1 00:12:17.085 }, 00:12:17.085 { 00:12:17.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.085 "dma_device_type": 2 00:12:17.085 } 00:12:17.085 ], 00:12:17.085 "driver_specific": { 00:12:17.085 "passthru": { 00:12:17.085 "name": "pt1", 00:12:17.085 "base_bdev_name": "malloc1" 00:12:17.085 } 00:12:17.085 } 00:12:17.085 }' 00:12:17.085 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.085 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.085 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:17.085 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.344 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.603 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:17.603 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:17.603 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:17.603 13:30:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:17.603 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:17.603 "name": "pt2", 00:12:17.603 "aliases": [ 00:12:17.603 "00000000-0000-0000-0000-000000000002" 00:12:17.603 ], 00:12:17.604 "product_name": "passthru", 00:12:17.604 "block_size": 512, 00:12:17.604 "num_blocks": 65536, 00:12:17.604 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:17.604 "assigned_rate_limits": { 00:12:17.604 "rw_ios_per_sec": 0, 00:12:17.604 "rw_mbytes_per_sec": 0, 00:12:17.604 "r_mbytes_per_sec": 0, 00:12:17.604 "w_mbytes_per_sec": 0 00:12:17.604 }, 00:12:17.604 "claimed": true, 00:12:17.604 "claim_type": "exclusive_write", 00:12:17.604 "zoned": false, 00:12:17.604 "supported_io_types": { 00:12:17.604 "read": true, 00:12:17.604 "write": true, 00:12:17.604 "unmap": true, 00:12:17.604 "flush": true, 00:12:17.604 "reset": true, 00:12:17.604 "nvme_admin": false, 00:12:17.604 "nvme_io": false, 00:12:17.604 "nvme_io_md": false, 00:12:17.604 "write_zeroes": true, 
00:12:17.604 "zcopy": true, 00:12:17.604 "get_zone_info": false, 00:12:17.604 "zone_management": false, 00:12:17.604 "zone_append": false, 00:12:17.604 "compare": false, 00:12:17.604 "compare_and_write": false, 00:12:17.604 "abort": true, 00:12:17.604 "seek_hole": false, 00:12:17.604 "seek_data": false, 00:12:17.604 "copy": true, 00:12:17.604 "nvme_iov_md": false 00:12:17.604 }, 00:12:17.604 "memory_domains": [ 00:12:17.604 { 00:12:17.604 "dma_device_id": "system", 00:12:17.604 "dma_device_type": 1 00:12:17.604 }, 00:12:17.604 { 00:12:17.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.604 "dma_device_type": 2 00:12:17.604 } 00:12:17.604 ], 00:12:17.604 "driver_specific": { 00:12:17.604 "passthru": { 00:12:17.604 "name": "pt2", 00:12:17.604 "base_bdev_name": "malloc2" 00:12:17.604 } 00:12:17.604 } 00:12:17.604 }' 00:12:17.604 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.862 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.121 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.121 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:12:18.121 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:18.121 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:18.380 [2024-07-15 13:30:57.591552] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:18.380 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b02a4093-8523-45bb-8758-23f6f3f77971 '!=' b02a4093-8523-45bb-8758-23f6f3f77971 ']' 00:12:18.380 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:18.380 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:18.380 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:18.380 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:18.639 [2024-07-15 13:30:57.839997] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.639 13:30:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:18.897 13:30:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.897 "name": "raid_bdev1", 00:12:18.897 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:18.897 "strip_size_kb": 0, 00:12:18.897 "state": "online", 00:12:18.897 "raid_level": "raid1", 00:12:18.897 "superblock": true, 00:12:18.897 "num_base_bdevs": 2, 00:12:18.897 "num_base_bdevs_discovered": 1, 00:12:18.897 "num_base_bdevs_operational": 1, 00:12:18.897 "base_bdevs_list": [ 00:12:18.897 { 00:12:18.897 "name": null, 00:12:18.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.897 "is_configured": false, 00:12:18.897 "data_offset": 2048, 00:12:18.897 "data_size": 63488 00:12:18.897 }, 00:12:18.897 { 00:12:18.897 "name": "pt2", 00:12:18.897 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:18.897 "is_configured": true, 00:12:18.897 "data_offset": 2048, 00:12:18.897 "data_size": 63488 00:12:18.897 } 00:12:18.897 ] 00:12:18.897 }' 00:12:18.897 13:30:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.897 13:30:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.464 13:30:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:12:19.723 [2024-07-15 13:30:58.922841] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:19.723 [2024-07-15 13:30:58.922869] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:19.723 [2024-07-15 13:30:58.922930] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:19.723 [2024-07-15 13:30:58.922975] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:19.723 [2024-07-15 13:30:58.922987] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb8c590 name raid_bdev1, state offline 00:12:19.723 13:30:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.723 13:30:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:19.981 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:19.981 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:19.981 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:19.981 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:19.981 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:20.318 13:30:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:20.318 [2024-07-15 13:30:59.640699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:20.318 [2024-07-15 13:30:59.640747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:20.318 [2024-07-15 13:30:59.640765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f4160 00:12:20.318 [2024-07-15 13:30:59.640778] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:20.318 [2024-07-15 13:30:59.642448] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:20.318 [2024-07-15 13:30:59.642483] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:20.318 [2024-07-15 13:30:59.642553] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:20.318 [2024-07-15 13:30:59.642583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:20.318 [2024-07-15 13:30:59.642677] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9ea380 00:12:20.318 [2024-07-15 13:30:59.642689] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:20.318 [2024-07-15 13:30:59.642866] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9eba80 00:12:20.318 [2024-07-15 13:30:59.643009] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9ea380 00:12:20.318 [2024-07-15 13:30:59.643020] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9ea380 00:12:20.318 [2024-07-15 13:30:59.643116] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:20.318 pt2 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:20.318 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.319 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.319 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.319 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.319 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.319 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:20.589 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.589 "name": "raid_bdev1", 00:12:20.589 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:20.589 "strip_size_kb": 0, 00:12:20.589 "state": "online", 00:12:20.589 "raid_level": "raid1", 00:12:20.589 "superblock": true, 00:12:20.589 "num_base_bdevs": 2, 00:12:20.589 "num_base_bdevs_discovered": 1, 00:12:20.589 "num_base_bdevs_operational": 1, 00:12:20.589 "base_bdevs_list": [ 
00:12:20.589 { 00:12:20.589 "name": null, 00:12:20.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.589 "is_configured": false, 00:12:20.589 "data_offset": 2048, 00:12:20.590 "data_size": 63488 00:12:20.590 }, 00:12:20.590 { 00:12:20.590 "name": "pt2", 00:12:20.590 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:20.590 "is_configured": true, 00:12:20.590 "data_offset": 2048, 00:12:20.590 "data_size": 63488 00:12:20.590 } 00:12:20.590 ] 00:12:20.590 }' 00:12:20.590 13:30:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.590 13:30:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.156 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:21.415 [2024-07-15 13:31:00.723573] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:21.415 [2024-07-15 13:31:00.723601] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:21.415 [2024-07-15 13:31:00.723656] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:21.415 [2024-07-15 13:31:00.723703] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:21.415 [2024-07-15 13:31:00.723715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9ea380 name raid_bdev1, state offline 00:12:21.415 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.415 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:21.673 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:21.673 13:31:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:21.674 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:21.674 13:31:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:21.933 [2024-07-15 13:31:01.212844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:21.933 [2024-07-15 13:31:01.212901] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:21.933 [2024-07-15 13:31:01.212920] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb96520 00:12:21.933 [2024-07-15 13:31:01.212939] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:21.933 [2024-07-15 13:31:01.214583] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:21.933 [2024-07-15 13:31:01.214616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:21.933 [2024-07-15 13:31:01.214686] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:21.933 [2024-07-15 13:31:01.214713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:21.933 [2024-07-15 13:31:01.214814] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:21.933 [2024-07-15 13:31:01.214827] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:21.933 [2024-07-15 13:31:01.214841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9eb3f0 name raid_bdev1, state configuring 00:12:21.933 [2024-07-15 13:31:01.214864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:21.933 [2024-07-15 13:31:01.214938] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9ed2b0 00:12:21.933 [2024-07-15 13:31:01.214949] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:21.933 [2024-07-15 13:31:01.215116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ea350 00:12:21.933 [2024-07-15 13:31:01.215240] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9ed2b0 00:12:21.933 [2024-07-15 13:31:01.215250] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9ed2b0 00:12:21.933 [2024-07-15 13:31:01.215348] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:21.933 pt1 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.933 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:22.192 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.192 "name": "raid_bdev1", 00:12:22.192 "uuid": "b02a4093-8523-45bb-8758-23f6f3f77971", 00:12:22.192 "strip_size_kb": 0, 00:12:22.192 "state": "online", 00:12:22.192 "raid_level": "raid1", 00:12:22.192 "superblock": true, 00:12:22.192 "num_base_bdevs": 2, 00:12:22.192 "num_base_bdevs_discovered": 1, 00:12:22.192 "num_base_bdevs_operational": 1, 00:12:22.192 "base_bdevs_list": [ 00:12:22.192 { 00:12:22.192 "name": null, 00:12:22.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.192 "is_configured": false, 00:12:22.192 "data_offset": 2048, 00:12:22.192 "data_size": 63488 00:12:22.192 }, 00:12:22.192 { 00:12:22.192 "name": "pt2", 00:12:22.192 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:22.192 "is_configured": true, 00:12:22.192 "data_offset": 2048, 00:12:22.192 "data_size": 63488 00:12:22.192 } 00:12:22.192 ] 00:12:22.192 }' 00:12:22.192 13:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.192 13:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.759 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:22.759 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:23.018 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:23.018 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:23.018 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:23.277 [2024-07-15 13:31:02.536572] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' b02a4093-8523-45bb-8758-23f6f3f77971 '!=' b02a4093-8523-45bb-8758-23f6f3f77971 ']' 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2083050 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2083050 ']' 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2083050 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2083050 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2083050' 00:12:23.277 killing process with pid 2083050 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2083050 00:12:23.277 [2024-07-15 13:31:02.605710] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:23.277 [2024-07-15 13:31:02.605765] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:23.277 [2024-07-15 13:31:02.605811] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:12:23.277 [2024-07-15 13:31:02.605823] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9ed2b0 name raid_bdev1, state offline 00:12:23.277 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2083050 00:12:23.277 [2024-07-15 13:31:02.625128] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:23.536 13:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:23.536 00:12:23.536 real 0m16.032s 00:12:23.536 user 0m29.154s 00:12:23.536 sys 0m2.893s 00:12:23.536 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:23.536 13:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.536 ************************************ 00:12:23.536 END TEST raid_superblock_test 00:12:23.536 ************************************ 00:12:23.536 13:31:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:23.536 13:31:02 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:23.536 13:31:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:23.536 13:31:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:23.536 13:31:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:23.536 ************************************ 00:12:23.536 START TEST raid_read_error_test 00:12:23.536 ************************************ 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:23.536 
13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:23.536 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RsKrgibGRx 00:12:23.537 13:31:02 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2085482 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2085482 /var/tmp/spdk-raid.sock 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2085482 ']' 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:23.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:23.537 13:31:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.796 [2024-07-15 13:31:03.000547] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:12:23.796 [2024-07-15 13:31:03.000612] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2085482 ] 00:12:23.796 [2024-07-15 13:31:03.120460] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.053 [2024-07-15 13:31:03.225230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.053 [2024-07-15 13:31:03.285602] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.053 [2024-07-15 13:31:03.285638] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.620 13:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:24.620 13:31:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:24.620 13:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:24.620 13:31:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:24.878 BaseBdev1_malloc 00:12:24.878 13:31:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:25.135 true 00:12:25.135 13:31:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:25.393 [2024-07-15 13:31:04.654166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:25.393 [2024-07-15 13:31:04.654213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:25.393 [2024-07-15 13:31:04.654235] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d5d0d0 00:12:25.393 [2024-07-15 13:31:04.654248] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:25.393 [2024-07-15 13:31:04.656110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:25.393 [2024-07-15 13:31:04.656142] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:25.393 BaseBdev1 00:12:25.393 13:31:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:25.393 13:31:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:25.652 BaseBdev2_malloc 00:12:25.652 13:31:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:25.910 true 00:12:25.910 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:26.169 [2024-07-15 13:31:05.385026] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:26.169 [2024-07-15 13:31:05.385071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.169 [2024-07-15 13:31:05.385093] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d61910 00:12:26.169 [2024-07-15 13:31:05.385106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.169 [2024-07-15 13:31:05.386705] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.169 [2024-07-15 13:31:05.386735] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:26.169 BaseBdev2 00:12:26.169 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:26.427 [2024-07-15 13:31:05.629704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.427 [2024-07-15 13:31:05.631087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:26.427 [2024-07-15 13:31:05.631290] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d63320 00:12:26.427 [2024-07-15 13:31:05.631303] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:26.427 [2024-07-15 13:31:05.631500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bcad00 00:12:26.427 [2024-07-15 13:31:05.631653] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d63320 00:12:26.427 [2024-07-15 13:31:05.631663] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d63320 00:12:26.427 [2024-07-15 13:31:05.631777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.427 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.428 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.685 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.685 "name": "raid_bdev1", 00:12:26.685 "uuid": "6117ab58-29ef-41c4-933b-5685929f5b12", 00:12:26.685 "strip_size_kb": 0, 00:12:26.685 "state": "online", 00:12:26.685 "raid_level": "raid1", 00:12:26.685 "superblock": true, 00:12:26.685 "num_base_bdevs": 2, 00:12:26.686 "num_base_bdevs_discovered": 2, 00:12:26.686 "num_base_bdevs_operational": 2, 00:12:26.686 "base_bdevs_list": [ 00:12:26.686 { 00:12:26.686 "name": "BaseBdev1", 00:12:26.686 "uuid": "b9eef12f-609c-5ffb-b59a-f7d7f348b039", 00:12:26.686 "is_configured": true, 00:12:26.686 "data_offset": 2048, 00:12:26.686 "data_size": 63488 00:12:26.686 }, 00:12:26.686 { 00:12:26.686 "name": "BaseBdev2", 00:12:26.686 "uuid": "7c59287a-e089-5e6d-a5b6-6d1451278ecb", 00:12:26.686 "is_configured": true, 00:12:26.686 "data_offset": 2048, 00:12:26.686 "data_size": 63488 00:12:26.686 } 00:12:26.686 ] 00:12:26.686 }' 00:12:26.686 13:31:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.686 13:31:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.251 13:31:06 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:27.251 13:31:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:27.251 [2024-07-15 13:31:06.600556] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d5ec70 00:12:28.188 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:28.447 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:28.447 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:28.447 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.448 13:31:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.448 "name": "raid_bdev1", 00:12:28.448 "uuid": "6117ab58-29ef-41c4-933b-5685929f5b12", 00:12:28.448 "strip_size_kb": 0, 00:12:28.448 "state": "online", 00:12:28.448 "raid_level": "raid1", 00:12:28.448 "superblock": true, 00:12:28.448 "num_base_bdevs": 2, 00:12:28.448 "num_base_bdevs_discovered": 2, 00:12:28.448 "num_base_bdevs_operational": 2, 00:12:28.448 "base_bdevs_list": [ 00:12:28.448 { 00:12:28.448 "name": "BaseBdev1", 00:12:28.448 "uuid": "b9eef12f-609c-5ffb-b59a-f7d7f348b039", 00:12:28.448 "is_configured": true, 00:12:28.448 "data_offset": 2048, 00:12:28.448 "data_size": 63488 00:12:28.448 }, 00:12:28.448 { 00:12:28.448 "name": "BaseBdev2", 00:12:28.448 "uuid": "7c59287a-e089-5e6d-a5b6-6d1451278ecb", 00:12:28.448 "is_configured": true, 00:12:28.448 "data_offset": 2048, 00:12:28.448 "data_size": 63488 00:12:28.448 } 00:12:28.448 ] 00:12:28.448 }' 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.448 13:31:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:29.386 [2024-07-15 13:31:08.678103] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:12:29.386 [2024-07-15 13:31:08.678139] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:29.386 [2024-07-15 13:31:08.681273] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.386 [2024-07-15 13:31:08.681305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.386 [2024-07-15 13:31:08.681386] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.386 [2024-07-15 13:31:08.681398] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d63320 name raid_bdev1, state offline 00:12:29.386 0 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2085482 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2085482 ']' 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2085482 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2085482 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2085482' 00:12:29.386 killing process with pid 2085482 00:12:29.386 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2085482 00:12:29.386 [2024-07-15 13:31:08.747141] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:29.386 13:31:08 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2085482 00:12:29.386 [2024-07-15 13:31:08.757987] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:29.645 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RsKrgibGRx 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:29.646 00:12:29.646 real 0m6.069s 00:12:29.646 user 0m9.501s 00:12:29.646 sys 0m1.020s 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:29.646 13:31:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.646 ************************************ 00:12:29.646 END TEST raid_read_error_test 00:12:29.646 ************************************ 00:12:29.646 13:31:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:29.646 13:31:09 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:29.646 13:31:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:29.646 13:31:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:29.646 13:31:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:29.646 ************************************ 00:12:29.646 START TEST raid_write_error_test 00:12:29.646 
************************************ 00:12:29.646 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:12:29.646 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:29.646 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.J1pUzXPekX 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2086448 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2086448 /var/tmp/spdk-raid.sock 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2086448 ']' 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:29.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:29.905 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.905 [2024-07-15 13:31:09.116941] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:12:29.905 [2024-07-15 13:31:09.116991] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2086448 ] 00:12:29.905 [2024-07-15 13:31:09.228569] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:30.164 [2024-07-15 13:31:09.333242] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:30.164 [2024-07-15 13:31:09.387351] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.164 [2024-07-15 13:31:09.387376] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.424 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:30.424 13:31:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:30.424 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:30.424 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:30.424 BaseBdev1_malloc 00:12:30.683 13:31:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:30.942 true 00:12:30.942 13:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:31.201 [2024-07-15 13:31:10.602567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:31.201 [2024-07-15 13:31:10.602615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:31.201 [2024-07-15 13:31:10.602636] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25130d0 00:12:31.201 [2024-07-15 13:31:10.602649] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:31.201 [2024-07-15 13:31:10.604551] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:31.201 [2024-07-15 13:31:10.604582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:31.201 BaseBdev1 00:12:31.201 13:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:31.201 13:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:31.460 BaseBdev2_malloc 00:12:31.460 13:31:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:32.028 true 00:12:32.028 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:32.288 [2024-07-15 13:31:11.589777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:32.288 [2024-07-15 13:31:11.589821] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:32.288 [2024-07-15 13:31:11.589842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2517910 00:12:32.288 [2024-07-15 13:31:11.589855] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:32.288 [2024-07-15 13:31:11.591438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:32.288 [2024-07-15 13:31:11.591467] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:32.288 BaseBdev2 00:12:32.288 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:32.548 [2024-07-15 13:31:11.834445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:32.548 [2024-07-15 13:31:11.835812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:32.548 [2024-07-15 13:31:11.836014] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2519320 00:12:32.548 [2024-07-15 13:31:11.836028] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:32.548 [2024-07-15 13:31:11.836224] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2380d00 00:12:32.548 [2024-07-15 13:31:11.836380] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2519320 00:12:32.548 [2024-07-15 13:31:11.836390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2519320 00:12:32.548 [2024-07-15 13:31:11.836499] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.548 13:31:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:32.808 13:31:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.808 "name": "raid_bdev1", 00:12:32.808 "uuid": "69133442-ee20-4349-a906-5b8246b605c3", 00:12:32.808 "strip_size_kb": 0, 00:12:32.808 "state": "online", 00:12:32.808 "raid_level": "raid1", 00:12:32.808 "superblock": true, 00:12:32.808 "num_base_bdevs": 2, 00:12:32.808 "num_base_bdevs_discovered": 2, 00:12:32.808 "num_base_bdevs_operational": 2, 00:12:32.808 "base_bdevs_list": [ 00:12:32.808 { 00:12:32.808 "name": "BaseBdev1", 00:12:32.808 "uuid": "2073472a-4320-5d8b-bb4c-21c49a4642f5", 00:12:32.808 "is_configured": true, 00:12:32.808 "data_offset": 2048, 00:12:32.808 "data_size": 63488 00:12:32.808 }, 00:12:32.808 { 00:12:32.808 "name": "BaseBdev2", 00:12:32.808 "uuid": "6882569f-9d65-5156-a329-6f0e47dfab02", 00:12:32.808 "is_configured": true, 00:12:32.808 "data_offset": 2048, 00:12:32.808 "data_size": 63488 00:12:32.808 } 00:12:32.808 ] 00:12:32.808 }' 00:12:32.808 13:31:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.808 13:31:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.376 
13:31:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:33.376 13:31:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:33.376 [2024-07-15 13:31:12.781332] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2514c70 00:12:34.314 13:31:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:34.882 [2024-07-15 13:31:14.168137] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:12:34.882 [2024-07-15 13:31:14.168186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:34.882 [2024-07-15 13:31:14.168360] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2514c70 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:34.882 13:31:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.882 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:35.142 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.142 "name": "raid_bdev1", 00:12:35.142 "uuid": "69133442-ee20-4349-a906-5b8246b605c3", 00:12:35.142 "strip_size_kb": 0, 00:12:35.142 "state": "online", 00:12:35.142 "raid_level": "raid1", 00:12:35.142 "superblock": true, 00:12:35.142 "num_base_bdevs": 2, 00:12:35.142 "num_base_bdevs_discovered": 1, 00:12:35.142 "num_base_bdevs_operational": 1, 00:12:35.142 "base_bdevs_list": [ 00:12:35.142 { 00:12:35.142 "name": null, 00:12:35.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.142 "is_configured": false, 00:12:35.142 "data_offset": 2048, 00:12:35.142 "data_size": 63488 00:12:35.142 }, 00:12:35.142 { 00:12:35.142 "name": "BaseBdev2", 00:12:35.142 "uuid": "6882569f-9d65-5156-a329-6f0e47dfab02", 00:12:35.142 "is_configured": true, 00:12:35.142 "data_offset": 2048, 00:12:35.142 "data_size": 63488 00:12:35.142 } 00:12:35.142 ] 00:12:35.142 }' 00:12:35.142 13:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:12:35.142 13:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:35.711 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:35.971 [2024-07-15 13:31:15.268480] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:35.971 [2024-07-15 13:31:15.268520] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:35.971 [2024-07-15 13:31:15.271699] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:35.971 [2024-07-15 13:31:15.271726] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:35.971 [2024-07-15 13:31:15.271778] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:35.971 [2024-07-15 13:31:15.271790] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2519320 name raid_bdev1, state offline 00:12:35.971 0 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2086448 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2086448 ']' 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2086448 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2086448 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2086448' 00:12:35.971 killing process with pid 2086448 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2086448 00:12:35.971 [2024-07-15 13:31:15.331870] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:35.971 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2086448 00:12:35.971 [2024-07-15 13:31:15.341989] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.J1pUzXPekX 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:36.230 00:12:36.230 real 0m6.494s 00:12:36.230 user 0m10.774s 00:12:36.230 sys 0m1.099s 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:36.230 13:31:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.231 ************************************ 00:12:36.231 END TEST raid_write_error_test 00:12:36.231 ************************************ 00:12:36.231 13:31:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:36.231 13:31:15 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:36.231 13:31:15 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:36.231 13:31:15 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:36.231 13:31:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:36.231 13:31:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:36.231 13:31:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:36.231 ************************************ 00:12:36.231 START TEST raid_state_function_test 00:12:36.231 ************************************ 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.231 13:31:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2087394 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2087394' 00:12:36.231 Process raid pid: 2087394 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2087394 /var/tmp/spdk-raid.sock 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2087394 ']' 00:12:36.231 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:36.490 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:36.490 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:36.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:36.490 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:36.490 13:31:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.490 [2024-07-15 13:31:15.713623] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:12:36.490 [2024-07-15 13:31:15.713688] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:36.490 [2024-07-15 13:31:15.840855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:36.749 [2024-07-15 13:31:15.944763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:36.749 [2024-07-15 13:31:16.004906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:36.749 [2024-07-15 13:31:16.004939] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:37.319 13:31:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:37.319 13:31:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:37.319 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:37.586 [2024-07-15 13:31:16.807161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:37.586 [2024-07-15 13:31:16.807216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:37.586 [2024-07-15 13:31:16.807229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:37.586 [2024-07-15 13:31:16.807241] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:37.586 [2024-07-15 13:31:16.807250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:37.586 [2024-07-15 13:31:16.807261] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:37.586 13:31:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:37.586 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.587 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.587 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.587 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.587 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.587 13:31:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.846 13:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.846 "name": "Existed_Raid", 00:12:37.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.846 "strip_size_kb": 64, 00:12:37.846 "state": "configuring", 00:12:37.846 "raid_level": "raid0", 00:12:37.846 "superblock": false, 00:12:37.846 "num_base_bdevs": 3, 00:12:37.846 "num_base_bdevs_discovered": 0, 00:12:37.846 "num_base_bdevs_operational": 3, 00:12:37.846 "base_bdevs_list": [ 00:12:37.846 { 
00:12:37.846 "name": "BaseBdev1", 00:12:37.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.846 "is_configured": false, 00:12:37.846 "data_offset": 0, 00:12:37.846 "data_size": 0 00:12:37.846 }, 00:12:37.846 { 00:12:37.846 "name": "BaseBdev2", 00:12:37.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.846 "is_configured": false, 00:12:37.846 "data_offset": 0, 00:12:37.846 "data_size": 0 00:12:37.846 }, 00:12:37.846 { 00:12:37.846 "name": "BaseBdev3", 00:12:37.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.846 "is_configured": false, 00:12:37.846 "data_offset": 0, 00:12:37.846 "data_size": 0 00:12:37.846 } 00:12:37.846 ] 00:12:37.846 }' 00:12:37.846 13:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.846 13:31:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.413 13:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:38.671 [2024-07-15 13:31:17.901898] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:38.671 [2024-07-15 13:31:17.901935] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215aa80 name Existed_Raid, state configuring 00:12:38.671 13:31:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:38.929 [2024-07-15 13:31:18.154576] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:38.929 [2024-07-15 13:31:18.154605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:38.929 [2024-07-15 13:31:18.154614] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:12:38.929 [2024-07-15 13:31:18.154626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:38.929 [2024-07-15 13:31:18.154635] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:38.929 [2024-07-15 13:31:18.154646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:38.929 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:39.189 [2024-07-15 13:31:18.434688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:39.189 BaseBdev1 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:39.189 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:39.448 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:39.707 [ 00:12:39.707 { 00:12:39.707 "name": "BaseBdev1", 00:12:39.707 "aliases": [ 00:12:39.707 
"3e9a8a50-c554-4b42-a452-a88036107f50" 00:12:39.707 ], 00:12:39.707 "product_name": "Malloc disk", 00:12:39.707 "block_size": 512, 00:12:39.707 "num_blocks": 65536, 00:12:39.707 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:39.707 "assigned_rate_limits": { 00:12:39.707 "rw_ios_per_sec": 0, 00:12:39.707 "rw_mbytes_per_sec": 0, 00:12:39.707 "r_mbytes_per_sec": 0, 00:12:39.707 "w_mbytes_per_sec": 0 00:12:39.707 }, 00:12:39.707 "claimed": true, 00:12:39.707 "claim_type": "exclusive_write", 00:12:39.707 "zoned": false, 00:12:39.707 "supported_io_types": { 00:12:39.707 "read": true, 00:12:39.707 "write": true, 00:12:39.707 "unmap": true, 00:12:39.707 "flush": true, 00:12:39.707 "reset": true, 00:12:39.707 "nvme_admin": false, 00:12:39.707 "nvme_io": false, 00:12:39.707 "nvme_io_md": false, 00:12:39.707 "write_zeroes": true, 00:12:39.707 "zcopy": true, 00:12:39.707 "get_zone_info": false, 00:12:39.707 "zone_management": false, 00:12:39.707 "zone_append": false, 00:12:39.707 "compare": false, 00:12:39.707 "compare_and_write": false, 00:12:39.707 "abort": true, 00:12:39.707 "seek_hole": false, 00:12:39.707 "seek_data": false, 00:12:39.707 "copy": true, 00:12:39.707 "nvme_iov_md": false 00:12:39.707 }, 00:12:39.707 "memory_domains": [ 00:12:39.707 { 00:12:39.707 "dma_device_id": "system", 00:12:39.707 "dma_device_type": 1 00:12:39.707 }, 00:12:39.707 { 00:12:39.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.707 "dma_device_type": 2 00:12:39.707 } 00:12:39.707 ], 00:12:39.707 "driver_specific": {} 00:12:39.707 } 00:12:39.707 ] 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.707 13:31:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.966 13:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.966 "name": "Existed_Raid", 00:12:39.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.966 "strip_size_kb": 64, 00:12:39.966 "state": "configuring", 00:12:39.966 "raid_level": "raid0", 00:12:39.966 "superblock": false, 00:12:39.966 "num_base_bdevs": 3, 00:12:39.966 "num_base_bdevs_discovered": 1, 00:12:39.966 "num_base_bdevs_operational": 3, 00:12:39.966 "base_bdevs_list": [ 00:12:39.966 { 00:12:39.966 "name": "BaseBdev1", 00:12:39.966 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:39.966 "is_configured": true, 00:12:39.966 "data_offset": 0, 00:12:39.966 "data_size": 65536 00:12:39.966 }, 00:12:39.966 { 00:12:39.966 "name": "BaseBdev2", 00:12:39.966 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:39.966 "is_configured": false, 00:12:39.967 "data_offset": 0, 00:12:39.967 "data_size": 0 00:12:39.967 }, 00:12:39.967 { 00:12:39.967 "name": "BaseBdev3", 00:12:39.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.967 "is_configured": false, 00:12:39.967 "data_offset": 0, 00:12:39.967 "data_size": 0 00:12:39.967 } 00:12:39.967 ] 00:12:39.967 }' 00:12:39.967 13:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.967 13:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.534 13:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:40.792 [2024-07-15 13:31:20.042966] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:40.792 [2024-07-15 13:31:20.043012] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215a310 name Existed_Raid, state configuring 00:12:40.792 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:41.052 [2024-07-15 13:31:20.299676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:41.052 [2024-07-15 13:31:20.301151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:41.052 [2024-07-15 13:31:20.301187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:41.052 [2024-07-15 13:31:20.301197] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:41.052 [2024-07-15 13:31:20.301208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.052 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.311 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.311 "name": "Existed_Raid", 00:12:41.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.311 "strip_size_kb": 64, 00:12:41.311 "state": "configuring", 00:12:41.311 
"raid_level": "raid0", 00:12:41.311 "superblock": false, 00:12:41.311 "num_base_bdevs": 3, 00:12:41.311 "num_base_bdevs_discovered": 1, 00:12:41.311 "num_base_bdevs_operational": 3, 00:12:41.311 "base_bdevs_list": [ 00:12:41.311 { 00:12:41.311 "name": "BaseBdev1", 00:12:41.311 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:41.311 "is_configured": true, 00:12:41.311 "data_offset": 0, 00:12:41.311 "data_size": 65536 00:12:41.311 }, 00:12:41.311 { 00:12:41.311 "name": "BaseBdev2", 00:12:41.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.311 "is_configured": false, 00:12:41.311 "data_offset": 0, 00:12:41.311 "data_size": 0 00:12:41.311 }, 00:12:41.311 { 00:12:41.311 "name": "BaseBdev3", 00:12:41.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.311 "is_configured": false, 00:12:41.311 "data_offset": 0, 00:12:41.311 "data_size": 0 00:12:41.311 } 00:12:41.311 ] 00:12:41.311 }' 00:12:41.311 13:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.311 13:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.879 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:42.138 [2024-07-15 13:31:21.355434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:42.138 BaseBdev2 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:42.138 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:42.396 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:42.655 [ 00:12:42.655 { 00:12:42.655 "name": "BaseBdev2", 00:12:42.655 "aliases": [ 00:12:42.655 "3110c684-8795-4493-a67d-0953e6a2db59" 00:12:42.655 ], 00:12:42.655 "product_name": "Malloc disk", 00:12:42.655 "block_size": 512, 00:12:42.655 "num_blocks": 65536, 00:12:42.655 "uuid": "3110c684-8795-4493-a67d-0953e6a2db59", 00:12:42.655 "assigned_rate_limits": { 00:12:42.655 "rw_ios_per_sec": 0, 00:12:42.655 "rw_mbytes_per_sec": 0, 00:12:42.655 "r_mbytes_per_sec": 0, 00:12:42.655 "w_mbytes_per_sec": 0 00:12:42.655 }, 00:12:42.655 "claimed": true, 00:12:42.655 "claim_type": "exclusive_write", 00:12:42.655 "zoned": false, 00:12:42.655 "supported_io_types": { 00:12:42.655 "read": true, 00:12:42.655 "write": true, 00:12:42.655 "unmap": true, 00:12:42.655 "flush": true, 00:12:42.655 "reset": true, 00:12:42.655 "nvme_admin": false, 00:12:42.655 "nvme_io": false, 00:12:42.655 "nvme_io_md": false, 00:12:42.655 "write_zeroes": true, 00:12:42.655 "zcopy": true, 00:12:42.655 "get_zone_info": false, 00:12:42.655 "zone_management": false, 00:12:42.655 "zone_append": false, 00:12:42.655 "compare": false, 00:12:42.655 "compare_and_write": false, 00:12:42.655 "abort": true, 00:12:42.655 "seek_hole": false, 00:12:42.655 "seek_data": false, 00:12:42.655 "copy": true, 00:12:42.656 "nvme_iov_md": false 00:12:42.656 }, 00:12:42.656 "memory_domains": [ 00:12:42.656 { 00:12:42.656 "dma_device_id": "system", 
00:12:42.656 "dma_device_type": 1 00:12:42.656 }, 00:12:42.656 { 00:12:42.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.656 "dma_device_type": 2 00:12:42.656 } 00:12:42.656 ], 00:12:42.656 "driver_specific": {} 00:12:42.656 } 00:12:42.656 ] 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.656 13:31:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.915 13:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.915 "name": "Existed_Raid", 00:12:42.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.915 "strip_size_kb": 64, 00:12:42.915 "state": "configuring", 00:12:42.915 "raid_level": "raid0", 00:12:42.915 "superblock": false, 00:12:42.915 "num_base_bdevs": 3, 00:12:42.915 "num_base_bdevs_discovered": 2, 00:12:42.915 "num_base_bdevs_operational": 3, 00:12:42.915 "base_bdevs_list": [ 00:12:42.915 { 00:12:42.915 "name": "BaseBdev1", 00:12:42.915 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:42.915 "is_configured": true, 00:12:42.915 "data_offset": 0, 00:12:42.915 "data_size": 65536 00:12:42.915 }, 00:12:42.915 { 00:12:42.915 "name": "BaseBdev2", 00:12:42.915 "uuid": "3110c684-8795-4493-a67d-0953e6a2db59", 00:12:42.915 "is_configured": true, 00:12:42.915 "data_offset": 0, 00:12:42.915 "data_size": 65536 00:12:42.915 }, 00:12:42.915 { 00:12:42.915 "name": "BaseBdev3", 00:12:42.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.915 "is_configured": false, 00:12:42.915 "data_offset": 0, 00:12:42.915 "data_size": 0 00:12:42.915 } 00:12:42.915 ] 00:12:42.915 }' 00:12:42.915 13:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.916 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.484 13:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:43.484 [2024-07-15 13:31:22.840295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:43.484 [2024-07-15 13:31:22.840335] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x215b400 00:12:43.485 [2024-07-15 13:31:22.840344] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:43.485 [2024-07-15 13:31:22.840589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215aef0 00:12:43.485 [2024-07-15 13:31:22.840714] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x215b400 00:12:43.485 [2024-07-15 13:31:22.840724] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x215b400 00:12:43.485 [2024-07-15 13:31:22.840885] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:43.485 BaseBdev3 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:43.485 13:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:43.744 13:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:44.010 [ 00:12:44.010 { 00:12:44.010 "name": "BaseBdev3", 00:12:44.010 "aliases": [ 00:12:44.010 "49da6b39-5628-4e18-bdec-5527bb86000b" 00:12:44.010 ], 00:12:44.010 "product_name": "Malloc disk", 00:12:44.010 "block_size": 512, 00:12:44.010 "num_blocks": 65536, 00:12:44.010 
"uuid": "49da6b39-5628-4e18-bdec-5527bb86000b", 00:12:44.010 "assigned_rate_limits": { 00:12:44.010 "rw_ios_per_sec": 0, 00:12:44.010 "rw_mbytes_per_sec": 0, 00:12:44.010 "r_mbytes_per_sec": 0, 00:12:44.010 "w_mbytes_per_sec": 0 00:12:44.010 }, 00:12:44.010 "claimed": true, 00:12:44.010 "claim_type": "exclusive_write", 00:12:44.010 "zoned": false, 00:12:44.010 "supported_io_types": { 00:12:44.010 "read": true, 00:12:44.010 "write": true, 00:12:44.010 "unmap": true, 00:12:44.010 "flush": true, 00:12:44.010 "reset": true, 00:12:44.010 "nvme_admin": false, 00:12:44.010 "nvme_io": false, 00:12:44.010 "nvme_io_md": false, 00:12:44.010 "write_zeroes": true, 00:12:44.010 "zcopy": true, 00:12:44.010 "get_zone_info": false, 00:12:44.010 "zone_management": false, 00:12:44.010 "zone_append": false, 00:12:44.010 "compare": false, 00:12:44.010 "compare_and_write": false, 00:12:44.010 "abort": true, 00:12:44.010 "seek_hole": false, 00:12:44.010 "seek_data": false, 00:12:44.010 "copy": true, 00:12:44.010 "nvme_iov_md": false 00:12:44.010 }, 00:12:44.010 "memory_domains": [ 00:12:44.010 { 00:12:44.010 "dma_device_id": "system", 00:12:44.010 "dma_device_type": 1 00:12:44.010 }, 00:12:44.010 { 00:12:44.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.010 "dma_device_type": 2 00:12:44.010 } 00:12:44.010 ], 00:12:44.010 "driver_specific": {} 00:12:44.010 } 00:12:44.010 ] 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.010 13:31:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.010 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.272 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.272 "name": "Existed_Raid", 00:12:44.272 "uuid": "7c77a354-5a66-48f5-878e-c9216e97ec31", 00:12:44.272 "strip_size_kb": 64, 00:12:44.272 "state": "online", 00:12:44.272 "raid_level": "raid0", 00:12:44.272 "superblock": false, 00:12:44.272 "num_base_bdevs": 3, 00:12:44.272 "num_base_bdevs_discovered": 3, 00:12:44.272 "num_base_bdevs_operational": 3, 00:12:44.272 "base_bdevs_list": [ 00:12:44.272 { 00:12:44.272 "name": "BaseBdev1", 00:12:44.272 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:44.272 "is_configured": true, 00:12:44.272 "data_offset": 0, 00:12:44.272 "data_size": 65536 00:12:44.272 }, 00:12:44.272 { 00:12:44.272 "name": "BaseBdev2", 00:12:44.272 "uuid": 
"3110c684-8795-4493-a67d-0953e6a2db59", 00:12:44.272 "is_configured": true, 00:12:44.272 "data_offset": 0, 00:12:44.272 "data_size": 65536 00:12:44.272 }, 00:12:44.272 { 00:12:44.272 "name": "BaseBdev3", 00:12:44.272 "uuid": "49da6b39-5628-4e18-bdec-5527bb86000b", 00:12:44.272 "is_configured": true, 00:12:44.272 "data_offset": 0, 00:12:44.272 "data_size": 65536 00:12:44.272 } 00:12:44.272 ] 00:12:44.272 }' 00:12:44.272 13:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.272 13:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:44.840 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:45.128 [2024-07-15 13:31:24.276411] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:45.128 "name": "Existed_Raid", 00:12:45.128 "aliases": [ 00:12:45.128 "7c77a354-5a66-48f5-878e-c9216e97ec31" 00:12:45.128 ], 00:12:45.128 "product_name": "Raid Volume", 
00:12:45.128 "block_size": 512, 00:12:45.128 "num_blocks": 196608, 00:12:45.128 "uuid": "7c77a354-5a66-48f5-878e-c9216e97ec31", 00:12:45.128 "assigned_rate_limits": { 00:12:45.128 "rw_ios_per_sec": 0, 00:12:45.128 "rw_mbytes_per_sec": 0, 00:12:45.128 "r_mbytes_per_sec": 0, 00:12:45.128 "w_mbytes_per_sec": 0 00:12:45.128 }, 00:12:45.128 "claimed": false, 00:12:45.128 "zoned": false, 00:12:45.128 "supported_io_types": { 00:12:45.128 "read": true, 00:12:45.128 "write": true, 00:12:45.128 "unmap": true, 00:12:45.128 "flush": true, 00:12:45.128 "reset": true, 00:12:45.128 "nvme_admin": false, 00:12:45.128 "nvme_io": false, 00:12:45.128 "nvme_io_md": false, 00:12:45.128 "write_zeroes": true, 00:12:45.128 "zcopy": false, 00:12:45.128 "get_zone_info": false, 00:12:45.128 "zone_management": false, 00:12:45.128 "zone_append": false, 00:12:45.128 "compare": false, 00:12:45.128 "compare_and_write": false, 00:12:45.128 "abort": false, 00:12:45.128 "seek_hole": false, 00:12:45.128 "seek_data": false, 00:12:45.128 "copy": false, 00:12:45.128 "nvme_iov_md": false 00:12:45.128 }, 00:12:45.128 "memory_domains": [ 00:12:45.128 { 00:12:45.128 "dma_device_id": "system", 00:12:45.128 "dma_device_type": 1 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.128 "dma_device_type": 2 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "dma_device_id": "system", 00:12:45.128 "dma_device_type": 1 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.128 "dma_device_type": 2 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "dma_device_id": "system", 00:12:45.128 "dma_device_type": 1 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.128 "dma_device_type": 2 00:12:45.128 } 00:12:45.128 ], 00:12:45.128 "driver_specific": { 00:12:45.128 "raid": { 00:12:45.128 "uuid": "7c77a354-5a66-48f5-878e-c9216e97ec31", 00:12:45.128 "strip_size_kb": 64, 00:12:45.128 "state": "online", 00:12:45.128 
"raid_level": "raid0", 00:12:45.128 "superblock": false, 00:12:45.128 "num_base_bdevs": 3, 00:12:45.128 "num_base_bdevs_discovered": 3, 00:12:45.128 "num_base_bdevs_operational": 3, 00:12:45.128 "base_bdevs_list": [ 00:12:45.128 { 00:12:45.128 "name": "BaseBdev1", 00:12:45.128 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:45.128 "is_configured": true, 00:12:45.128 "data_offset": 0, 00:12:45.128 "data_size": 65536 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "name": "BaseBdev2", 00:12:45.128 "uuid": "3110c684-8795-4493-a67d-0953e6a2db59", 00:12:45.128 "is_configured": true, 00:12:45.128 "data_offset": 0, 00:12:45.128 "data_size": 65536 00:12:45.128 }, 00:12:45.128 { 00:12:45.128 "name": "BaseBdev3", 00:12:45.128 "uuid": "49da6b39-5628-4e18-bdec-5527bb86000b", 00:12:45.128 "is_configured": true, 00:12:45.128 "data_offset": 0, 00:12:45.128 "data_size": 65536 00:12:45.128 } 00:12:45.128 ] 00:12:45.128 } 00:12:45.128 } 00:12:45.128 }' 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:45.128 BaseBdev2 00:12:45.128 BaseBdev3' 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:45.128 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:45.388 "name": "BaseBdev1", 00:12:45.388 "aliases": [ 00:12:45.388 "3e9a8a50-c554-4b42-a452-a88036107f50" 00:12:45.388 ], 00:12:45.388 "product_name": "Malloc disk", 00:12:45.388 
"block_size": 512, 00:12:45.388 "num_blocks": 65536, 00:12:45.388 "uuid": "3e9a8a50-c554-4b42-a452-a88036107f50", 00:12:45.388 "assigned_rate_limits": { 00:12:45.388 "rw_ios_per_sec": 0, 00:12:45.388 "rw_mbytes_per_sec": 0, 00:12:45.388 "r_mbytes_per_sec": 0, 00:12:45.388 "w_mbytes_per_sec": 0 00:12:45.388 }, 00:12:45.388 "claimed": true, 00:12:45.388 "claim_type": "exclusive_write", 00:12:45.388 "zoned": false, 00:12:45.388 "supported_io_types": { 00:12:45.388 "read": true, 00:12:45.388 "write": true, 00:12:45.388 "unmap": true, 00:12:45.388 "flush": true, 00:12:45.388 "reset": true, 00:12:45.388 "nvme_admin": false, 00:12:45.388 "nvme_io": false, 00:12:45.388 "nvme_io_md": false, 00:12:45.388 "write_zeroes": true, 00:12:45.388 "zcopy": true, 00:12:45.388 "get_zone_info": false, 00:12:45.388 "zone_management": false, 00:12:45.388 "zone_append": false, 00:12:45.388 "compare": false, 00:12:45.388 "compare_and_write": false, 00:12:45.388 "abort": true, 00:12:45.388 "seek_hole": false, 00:12:45.388 "seek_data": false, 00:12:45.388 "copy": true, 00:12:45.388 "nvme_iov_md": false 00:12:45.388 }, 00:12:45.388 "memory_domains": [ 00:12:45.388 { 00:12:45.388 "dma_device_id": "system", 00:12:45.388 "dma_device_type": 1 00:12:45.388 }, 00:12:45.388 { 00:12:45.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.388 "dma_device_type": 2 00:12:45.388 } 00:12:45.388 ], 00:12:45.388 "driver_specific": {} 00:12:45.388 }' 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:45.388 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:45.648 13:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:45.908 "name": "BaseBdev2", 00:12:45.908 "aliases": [ 00:12:45.908 "3110c684-8795-4493-a67d-0953e6a2db59" 00:12:45.908 ], 00:12:45.908 "product_name": "Malloc disk", 00:12:45.908 "block_size": 512, 00:12:45.908 "num_blocks": 65536, 00:12:45.908 "uuid": "3110c684-8795-4493-a67d-0953e6a2db59", 00:12:45.908 "assigned_rate_limits": { 00:12:45.908 "rw_ios_per_sec": 0, 00:12:45.908 "rw_mbytes_per_sec": 0, 00:12:45.908 "r_mbytes_per_sec": 0, 00:12:45.908 "w_mbytes_per_sec": 0 00:12:45.908 }, 00:12:45.908 "claimed": true, 00:12:45.908 "claim_type": "exclusive_write", 00:12:45.908 "zoned": false, 00:12:45.908 "supported_io_types": { 00:12:45.908 "read": true, 00:12:45.908 "write": true, 00:12:45.908 "unmap": true, 00:12:45.908 "flush": true, 00:12:45.908 "reset": true, 00:12:45.908 "nvme_admin": 
false, 00:12:45.908 "nvme_io": false, 00:12:45.908 "nvme_io_md": false, 00:12:45.908 "write_zeroes": true, 00:12:45.908 "zcopy": true, 00:12:45.908 "get_zone_info": false, 00:12:45.908 "zone_management": false, 00:12:45.908 "zone_append": false, 00:12:45.908 "compare": false, 00:12:45.908 "compare_and_write": false, 00:12:45.908 "abort": true, 00:12:45.908 "seek_hole": false, 00:12:45.908 "seek_data": false, 00:12:45.908 "copy": true, 00:12:45.908 "nvme_iov_md": false 00:12:45.908 }, 00:12:45.908 "memory_domains": [ 00:12:45.908 { 00:12:45.908 "dma_device_id": "system", 00:12:45.908 "dma_device_type": 1 00:12:45.908 }, 00:12:45.908 { 00:12:45.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.908 "dma_device_type": 2 00:12:45.908 } 00:12:45.908 ], 00:12:45.908 "driver_specific": {} 00:12:45.908 }' 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.908 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:46.167 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:46.426 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:46.426 "name": "BaseBdev3", 00:12:46.426 "aliases": [ 00:12:46.426 "49da6b39-5628-4e18-bdec-5527bb86000b" 00:12:46.426 ], 00:12:46.426 "product_name": "Malloc disk", 00:12:46.426 "block_size": 512, 00:12:46.426 "num_blocks": 65536, 00:12:46.426 "uuid": "49da6b39-5628-4e18-bdec-5527bb86000b", 00:12:46.426 "assigned_rate_limits": { 00:12:46.426 "rw_ios_per_sec": 0, 00:12:46.426 "rw_mbytes_per_sec": 0, 00:12:46.426 "r_mbytes_per_sec": 0, 00:12:46.426 "w_mbytes_per_sec": 0 00:12:46.426 }, 00:12:46.426 "claimed": true, 00:12:46.426 "claim_type": "exclusive_write", 00:12:46.426 "zoned": false, 00:12:46.426 "supported_io_types": { 00:12:46.426 "read": true, 00:12:46.426 "write": true, 00:12:46.426 "unmap": true, 00:12:46.426 "flush": true, 00:12:46.426 "reset": true, 00:12:46.426 "nvme_admin": false, 00:12:46.426 "nvme_io": false, 00:12:46.426 "nvme_io_md": false, 00:12:46.426 "write_zeroes": true, 00:12:46.426 "zcopy": true, 00:12:46.426 "get_zone_info": false, 00:12:46.426 "zone_management": false, 00:12:46.426 "zone_append": false, 00:12:46.426 "compare": false, 00:12:46.426 "compare_and_write": false, 00:12:46.426 "abort": true, 00:12:46.426 "seek_hole": false, 00:12:46.426 "seek_data": false, 00:12:46.426 "copy": true, 00:12:46.426 "nvme_iov_md": false 00:12:46.426 }, 00:12:46.426 "memory_domains": [ 00:12:46.426 { 00:12:46.426 "dma_device_id": "system", 00:12:46.426 "dma_device_type": 1 00:12:46.426 
}, 00:12:46.426 { 00:12:46.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.426 "dma_device_type": 2 00:12:46.426 } 00:12:46.426 ], 00:12:46.426 "driver_specific": {} 00:12:46.426 }' 00:12:46.426 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.426 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.426 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:46.426 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.685 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.685 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:46.685 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:46.685 13:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:46.685 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:46.685 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:46.685 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:46.685 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:46.685 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:46.945 [2024-07-15 13:31:26.257463] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:46.945 [2024-07-15 13:31:26.257495] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:46.945 [2024-07-15 13:31:26.257537] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:46.945 
13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.945 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:12:47.204 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.204 "name": "Existed_Raid", 00:12:47.204 "uuid": "7c77a354-5a66-48f5-878e-c9216e97ec31", 00:12:47.204 "strip_size_kb": 64, 00:12:47.204 "state": "offline", 00:12:47.204 "raid_level": "raid0", 00:12:47.204 "superblock": false, 00:12:47.204 "num_base_bdevs": 3, 00:12:47.204 "num_base_bdevs_discovered": 2, 00:12:47.204 "num_base_bdevs_operational": 2, 00:12:47.204 "base_bdevs_list": [ 00:12:47.204 { 00:12:47.204 "name": null, 00:12:47.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.204 "is_configured": false, 00:12:47.204 "data_offset": 0, 00:12:47.204 "data_size": 65536 00:12:47.204 }, 00:12:47.204 { 00:12:47.204 "name": "BaseBdev2", 00:12:47.204 "uuid": "3110c684-8795-4493-a67d-0953e6a2db59", 00:12:47.204 "is_configured": true, 00:12:47.204 "data_offset": 0, 00:12:47.204 "data_size": 65536 00:12:47.204 }, 00:12:47.204 { 00:12:47.204 "name": "BaseBdev3", 00:12:47.204 "uuid": "49da6b39-5628-4e18-bdec-5527bb86000b", 00:12:47.204 "is_configured": true, 00:12:47.204 "data_offset": 0, 00:12:47.204 "data_size": 65536 00:12:47.204 } 00:12:47.204 ] 00:12:47.204 }' 00:12:47.204 13:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.204 13:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.773 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:47.773 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:47.773 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.773 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:48.032 13:31:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:48.032 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:48.032 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:48.292 [2024-07-15 13:31:27.593223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:48.292 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:48.292 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:48.292 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.292 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:48.552 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:48.552 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:48.552 13:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:48.812 [2024-07-15 13:31:28.126225] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:48.812 [2024-07-15 13:31:28.126274] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215b400 name Existed_Raid, state offline 00:12:48.812 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:48.812 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:48.812 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.812 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:49.071 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:49.330 BaseBdev2 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:49.330 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:49.590 13:31:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:49.849 [ 00:12:49.849 { 00:12:49.849 "name": "BaseBdev2", 00:12:49.849 "aliases": [ 00:12:49.849 "d675fd08-35bd-4f01-ba10-e161a01650db" 00:12:49.849 ], 00:12:49.849 "product_name": "Malloc disk", 00:12:49.849 "block_size": 512, 00:12:49.849 "num_blocks": 65536, 00:12:49.849 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:49.849 "assigned_rate_limits": { 00:12:49.849 "rw_ios_per_sec": 0, 00:12:49.849 "rw_mbytes_per_sec": 0, 00:12:49.849 "r_mbytes_per_sec": 0, 00:12:49.849 "w_mbytes_per_sec": 0 00:12:49.849 }, 00:12:49.849 "claimed": false, 00:12:49.849 "zoned": false, 00:12:49.849 "supported_io_types": { 00:12:49.849 "read": true, 00:12:49.849 "write": true, 00:12:49.849 "unmap": true, 00:12:49.849 "flush": true, 00:12:49.849 "reset": true, 00:12:49.849 "nvme_admin": false, 00:12:49.849 "nvme_io": false, 00:12:49.849 "nvme_io_md": false, 00:12:49.849 "write_zeroes": true, 00:12:49.849 "zcopy": true, 00:12:49.849 "get_zone_info": false, 00:12:49.849 "zone_management": false, 00:12:49.849 "zone_append": false, 00:12:49.849 "compare": false, 00:12:49.849 "compare_and_write": false, 00:12:49.849 "abort": true, 00:12:49.849 "seek_hole": false, 00:12:49.849 "seek_data": false, 00:12:49.849 "copy": true, 00:12:49.849 "nvme_iov_md": false 00:12:49.849 }, 00:12:49.849 "memory_domains": [ 00:12:49.850 { 00:12:49.850 "dma_device_id": "system", 00:12:49.850 "dma_device_type": 1 00:12:49.850 }, 00:12:49.850 { 00:12:49.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.850 "dma_device_type": 2 00:12:49.850 } 00:12:49.850 ], 00:12:49.850 "driver_specific": {} 00:12:49.850 } 00:12:49.850 ] 00:12:49.850 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:49.850 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:49.850 13:31:29 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:49.850 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:50.109 BaseBdev3 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.109 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:50.368 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:50.627 [ 00:12:50.627 { 00:12:50.627 "name": "BaseBdev3", 00:12:50.627 "aliases": [ 00:12:50.627 "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3" 00:12:50.627 ], 00:12:50.627 "product_name": "Malloc disk", 00:12:50.627 "block_size": 512, 00:12:50.627 "num_blocks": 65536, 00:12:50.627 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:50.627 "assigned_rate_limits": { 00:12:50.627 "rw_ios_per_sec": 0, 00:12:50.627 "rw_mbytes_per_sec": 0, 00:12:50.627 "r_mbytes_per_sec": 0, 00:12:50.627 "w_mbytes_per_sec": 0 00:12:50.627 }, 00:12:50.627 "claimed": false, 00:12:50.627 "zoned": false, 00:12:50.627 
"supported_io_types": { 00:12:50.627 "read": true, 00:12:50.627 "write": true, 00:12:50.627 "unmap": true, 00:12:50.627 "flush": true, 00:12:50.627 "reset": true, 00:12:50.627 "nvme_admin": false, 00:12:50.627 "nvme_io": false, 00:12:50.627 "nvme_io_md": false, 00:12:50.627 "write_zeroes": true, 00:12:50.627 "zcopy": true, 00:12:50.627 "get_zone_info": false, 00:12:50.627 "zone_management": false, 00:12:50.627 "zone_append": false, 00:12:50.627 "compare": false, 00:12:50.627 "compare_and_write": false, 00:12:50.627 "abort": true, 00:12:50.627 "seek_hole": false, 00:12:50.627 "seek_data": false, 00:12:50.627 "copy": true, 00:12:50.627 "nvme_iov_md": false 00:12:50.627 }, 00:12:50.627 "memory_domains": [ 00:12:50.627 { 00:12:50.627 "dma_device_id": "system", 00:12:50.627 "dma_device_type": 1 00:12:50.627 }, 00:12:50.627 { 00:12:50.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.627 "dma_device_type": 2 00:12:50.627 } 00:12:50.627 ], 00:12:50.628 "driver_specific": {} 00:12:50.628 } 00:12:50.628 ] 00:12:50.628 13:31:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:50.628 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:50.628 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:50.628 13:31:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:50.886 [2024-07-15 13:31:30.094224] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:50.886 [2024-07-15 13:31:30.094269] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:50.886 [2024-07-15 13:31:30.094291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:50.886 
[2024-07-15 13:31:30.095639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:50.886 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.887 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.146 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.146 "name": "Existed_Raid", 00:12:51.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.146 "strip_size_kb": 64, 00:12:51.146 "state": "configuring", 00:12:51.146 "raid_level": "raid0", 00:12:51.146 "superblock": false, 00:12:51.146 "num_base_bdevs": 3, 00:12:51.146 
"num_base_bdevs_discovered": 2, 00:12:51.146 "num_base_bdevs_operational": 3, 00:12:51.146 "base_bdevs_list": [ 00:12:51.146 { 00:12:51.146 "name": "BaseBdev1", 00:12:51.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.146 "is_configured": false, 00:12:51.146 "data_offset": 0, 00:12:51.146 "data_size": 0 00:12:51.146 }, 00:12:51.146 { 00:12:51.146 "name": "BaseBdev2", 00:12:51.146 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:51.146 "is_configured": true, 00:12:51.146 "data_offset": 0, 00:12:51.146 "data_size": 65536 00:12:51.146 }, 00:12:51.146 { 00:12:51.146 "name": "BaseBdev3", 00:12:51.146 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:51.146 "is_configured": true, 00:12:51.146 "data_offset": 0, 00:12:51.146 "data_size": 65536 00:12:51.146 } 00:12:51.146 ] 00:12:51.146 }' 00:12:51.146 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.146 13:31:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.712 13:31:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:51.712 [2024-07-15 13:31:31.116890] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.970 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.971 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.971 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.971 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.230 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.230 "name": "Existed_Raid", 00:12:52.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.230 "strip_size_kb": 64, 00:12:52.230 "state": "configuring", 00:12:52.230 "raid_level": "raid0", 00:12:52.230 "superblock": false, 00:12:52.230 "num_base_bdevs": 3, 00:12:52.230 "num_base_bdevs_discovered": 1, 00:12:52.230 "num_base_bdevs_operational": 3, 00:12:52.230 "base_bdevs_list": [ 00:12:52.230 { 00:12:52.230 "name": "BaseBdev1", 00:12:52.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.230 "is_configured": false, 00:12:52.230 "data_offset": 0, 00:12:52.230 "data_size": 0 00:12:52.230 }, 00:12:52.230 { 00:12:52.230 "name": null, 00:12:52.230 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:52.230 "is_configured": false, 00:12:52.230 "data_offset": 0, 00:12:52.230 "data_size": 65536 00:12:52.230 }, 00:12:52.230 { 00:12:52.230 "name": "BaseBdev3", 00:12:52.230 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:52.230 "is_configured": true, 00:12:52.230 "data_offset": 0, 00:12:52.230 "data_size": 65536 00:12:52.230 } 
00:12:52.230 ] 00:12:52.230 }' 00:12:52.230 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.230 13:31:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.797 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.797 13:31:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:53.055 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:53.055 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:53.314 [2024-07-15 13:31:32.489354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:53.314 BaseBdev1 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:53.314 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:53.573 [ 00:12:53.573 { 00:12:53.573 "name": "BaseBdev1", 00:12:53.573 "aliases": [ 00:12:53.573 "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c" 00:12:53.573 ], 00:12:53.573 "product_name": "Malloc disk", 00:12:53.573 "block_size": 512, 00:12:53.573 "num_blocks": 65536, 00:12:53.573 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:53.573 "assigned_rate_limits": { 00:12:53.573 "rw_ios_per_sec": 0, 00:12:53.573 "rw_mbytes_per_sec": 0, 00:12:53.573 "r_mbytes_per_sec": 0, 00:12:53.573 "w_mbytes_per_sec": 0 00:12:53.573 }, 00:12:53.573 "claimed": true, 00:12:53.573 "claim_type": "exclusive_write", 00:12:53.573 "zoned": false, 00:12:53.573 "supported_io_types": { 00:12:53.573 "read": true, 00:12:53.573 "write": true, 00:12:53.573 "unmap": true, 00:12:53.573 "flush": true, 00:12:53.573 "reset": true, 00:12:53.573 "nvme_admin": false, 00:12:53.573 "nvme_io": false, 00:12:53.573 "nvme_io_md": false, 00:12:53.573 "write_zeroes": true, 00:12:53.573 "zcopy": true, 00:12:53.573 "get_zone_info": false, 00:12:53.573 "zone_management": false, 00:12:53.573 "zone_append": false, 00:12:53.573 "compare": false, 00:12:53.573 "compare_and_write": false, 00:12:53.573 "abort": true, 00:12:53.573 "seek_hole": false, 00:12:53.573 "seek_data": false, 00:12:53.573 "copy": true, 00:12:53.573 "nvme_iov_md": false 00:12:53.573 }, 00:12:53.573 "memory_domains": [ 00:12:53.573 { 00:12:53.573 "dma_device_id": "system", 00:12:53.573 "dma_device_type": 1 00:12:53.573 }, 00:12:53.573 { 00:12:53.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.573 "dma_device_type": 2 00:12:53.573 } 00:12:53.573 ], 00:12:53.573 "driver_specific": {} 00:12:53.573 } 00:12:53.573 ] 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:53.573 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.832 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:53.832 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.832 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.832 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.832 13:31:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.832 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.832 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.832 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.832 "name": "Existed_Raid", 00:12:53.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.832 "strip_size_kb": 64, 00:12:53.832 "state": "configuring", 00:12:53.832 "raid_level": "raid0", 00:12:53.832 "superblock": false, 00:12:53.832 "num_base_bdevs": 3, 00:12:53.832 "num_base_bdevs_discovered": 2, 00:12:53.832 "num_base_bdevs_operational": 3, 00:12:53.832 "base_bdevs_list": [ 00:12:53.832 { 00:12:53.832 "name": "BaseBdev1", 00:12:53.832 
"uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:53.832 "is_configured": true, 00:12:53.832 "data_offset": 0, 00:12:53.832 "data_size": 65536 00:12:53.832 }, 00:12:53.832 { 00:12:53.832 "name": null, 00:12:53.832 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:53.832 "is_configured": false, 00:12:53.832 "data_offset": 0, 00:12:53.832 "data_size": 65536 00:12:53.832 }, 00:12:53.832 { 00:12:53.832 "name": "BaseBdev3", 00:12:53.832 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:53.832 "is_configured": true, 00:12:53.832 "data_offset": 0, 00:12:53.832 "data_size": 65536 00:12:53.832 } 00:12:53.832 ] 00:12:53.832 }' 00:12:53.832 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.832 13:31:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.448 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.448 13:31:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:54.706 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:54.706 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:54.964 [2024-07-15 13:31:34.250043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:54.964 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:54.964 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.964 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:12:54.964 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.965 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.222 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.222 "name": "Existed_Raid", 00:12:55.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.222 "strip_size_kb": 64, 00:12:55.222 "state": "configuring", 00:12:55.222 "raid_level": "raid0", 00:12:55.222 "superblock": false, 00:12:55.222 "num_base_bdevs": 3, 00:12:55.222 "num_base_bdevs_discovered": 1, 00:12:55.222 "num_base_bdevs_operational": 3, 00:12:55.222 "base_bdevs_list": [ 00:12:55.222 { 00:12:55.222 "name": "BaseBdev1", 00:12:55.222 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:55.222 "is_configured": true, 00:12:55.222 "data_offset": 0, 00:12:55.222 "data_size": 65536 00:12:55.222 }, 00:12:55.222 { 00:12:55.222 "name": null, 00:12:55.222 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:55.222 "is_configured": false, 00:12:55.222 
"data_offset": 0, 00:12:55.222 "data_size": 65536 00:12:55.222 }, 00:12:55.222 { 00:12:55.222 "name": null, 00:12:55.222 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:55.222 "is_configured": false, 00:12:55.222 "data_offset": 0, 00:12:55.222 "data_size": 65536 00:12:55.222 } 00:12:55.222 ] 00:12:55.222 }' 00:12:55.222 13:31:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.222 13:31:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.788 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.788 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:56.045 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:56.046 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:56.304 [2024-07-15 13:31:35.621703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.304 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.564 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.564 "name": "Existed_Raid", 00:12:56.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.564 "strip_size_kb": 64, 00:12:56.564 "state": "configuring", 00:12:56.564 "raid_level": "raid0", 00:12:56.564 "superblock": false, 00:12:56.564 "num_base_bdevs": 3, 00:12:56.564 "num_base_bdevs_discovered": 2, 00:12:56.564 "num_base_bdevs_operational": 3, 00:12:56.564 "base_bdevs_list": [ 00:12:56.564 { 00:12:56.564 "name": "BaseBdev1", 00:12:56.564 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:56.564 "is_configured": true, 00:12:56.564 "data_offset": 0, 00:12:56.564 "data_size": 65536 00:12:56.564 }, 00:12:56.564 { 00:12:56.564 "name": null, 00:12:56.564 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:56.564 "is_configured": false, 00:12:56.564 "data_offset": 0, 00:12:56.564 "data_size": 65536 00:12:56.564 }, 00:12:56.564 { 00:12:56.564 "name": "BaseBdev3", 00:12:56.564 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:56.564 "is_configured": true, 00:12:56.564 "data_offset": 0, 00:12:56.564 "data_size": 65536 00:12:56.564 } 00:12:56.564 ] 
00:12:56.564 }' 00:12:56.564 13:31:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.564 13:31:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:57.131 13:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.131 13:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:57.390 13:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:57.390 13:31:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:57.648 [2024-07-15 13:31:36.977337] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:57.648 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:57.648 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.649 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.907 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.907 "name": "Existed_Raid", 00:12:57.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.907 "strip_size_kb": 64, 00:12:57.907 "state": "configuring", 00:12:57.907 "raid_level": "raid0", 00:12:57.907 "superblock": false, 00:12:57.907 "num_base_bdevs": 3, 00:12:57.907 "num_base_bdevs_discovered": 1, 00:12:57.907 "num_base_bdevs_operational": 3, 00:12:57.907 "base_bdevs_list": [ 00:12:57.907 { 00:12:57.907 "name": null, 00:12:57.907 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:57.907 "is_configured": false, 00:12:57.907 "data_offset": 0, 00:12:57.907 "data_size": 65536 00:12:57.907 }, 00:12:57.907 { 00:12:57.907 "name": null, 00:12:57.907 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:57.907 "is_configured": false, 00:12:57.907 "data_offset": 0, 00:12:57.907 "data_size": 65536 00:12:57.907 }, 00:12:57.907 { 00:12:57.907 "name": "BaseBdev3", 00:12:57.907 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:57.907 "is_configured": true, 00:12:57.907 "data_offset": 0, 00:12:57.907 "data_size": 65536 00:12:57.907 } 00:12:57.907 ] 00:12:57.907 }' 00:12:57.907 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.907 13:31:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.475 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.475 13:31:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:58.734 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:58.734 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:58.993 [2024-07-15 13:31:38.348698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.993 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.252 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.252 "name": "Existed_Raid", 00:12:59.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.252 "strip_size_kb": 64, 00:12:59.252 "state": "configuring", 00:12:59.252 "raid_level": "raid0", 00:12:59.252 "superblock": false, 00:12:59.252 "num_base_bdevs": 3, 00:12:59.252 "num_base_bdevs_discovered": 2, 00:12:59.252 "num_base_bdevs_operational": 3, 00:12:59.252 "base_bdevs_list": [ 00:12:59.252 { 00:12:59.252 "name": null, 00:12:59.252 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:12:59.252 "is_configured": false, 00:12:59.252 "data_offset": 0, 00:12:59.252 "data_size": 65536 00:12:59.252 }, 00:12:59.252 { 00:12:59.252 "name": "BaseBdev2", 00:12:59.252 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:12:59.252 "is_configured": true, 00:12:59.252 "data_offset": 0, 00:12:59.252 "data_size": 65536 00:12:59.252 }, 00:12:59.252 { 00:12:59.252 "name": "BaseBdev3", 00:12:59.252 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:12:59.252 "is_configured": true, 00:12:59.252 "data_offset": 0, 00:12:59.252 "data_size": 65536 00:12:59.252 } 00:12:59.252 ] 00:12:59.252 }' 00:12:59.252 13:31:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.252 13:31:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.818 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.818 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:00.077 
13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:00.077 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.077 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:00.335 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c 00:13:00.595 [2024-07-15 13:31:39.949772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:00.595 [2024-07-15 13:31:39.949817] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2159450 00:13:00.595 [2024-07-15 13:31:39.949826] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:00.595 [2024-07-15 13:31:39.950034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215aa50 00:13:00.595 [2024-07-15 13:31:39.950159] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2159450 00:13:00.595 [2024-07-15 13:31:39.950169] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2159450 00:13:00.595 [2024-07-15 13:31:39.950337] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.595 NewBaseBdev 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:00.595 13:31:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:00.854 13:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:01.113 [ 00:13:01.113 { 00:13:01.113 "name": "NewBaseBdev", 00:13:01.113 "aliases": [ 00:13:01.113 "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c" 00:13:01.113 ], 00:13:01.113 "product_name": "Malloc disk", 00:13:01.113 "block_size": 512, 00:13:01.113 "num_blocks": 65536, 00:13:01.113 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:13:01.113 "assigned_rate_limits": { 00:13:01.113 "rw_ios_per_sec": 0, 00:13:01.113 "rw_mbytes_per_sec": 0, 00:13:01.113 "r_mbytes_per_sec": 0, 00:13:01.113 "w_mbytes_per_sec": 0 00:13:01.113 }, 00:13:01.113 "claimed": true, 00:13:01.113 "claim_type": "exclusive_write", 00:13:01.113 "zoned": false, 00:13:01.113 "supported_io_types": { 00:13:01.113 "read": true, 00:13:01.113 "write": true, 00:13:01.113 "unmap": true, 00:13:01.113 "flush": true, 00:13:01.113 "reset": true, 00:13:01.113 "nvme_admin": false, 00:13:01.113 "nvme_io": false, 00:13:01.113 "nvme_io_md": false, 00:13:01.113 "write_zeroes": true, 00:13:01.113 "zcopy": true, 00:13:01.113 "get_zone_info": false, 00:13:01.113 "zone_management": false, 00:13:01.113 "zone_append": false, 00:13:01.113 "compare": false, 00:13:01.113 "compare_and_write": false, 00:13:01.113 "abort": true, 00:13:01.113 "seek_hole": false, 00:13:01.113 "seek_data": false, 00:13:01.113 "copy": true, 00:13:01.113 "nvme_iov_md": 
false 00:13:01.113 }, 00:13:01.113 "memory_domains": [ 00:13:01.113 { 00:13:01.113 "dma_device_id": "system", 00:13:01.113 "dma_device_type": 1 00:13:01.113 }, 00:13:01.113 { 00:13:01.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.113 "dma_device_type": 2 00:13:01.113 } 00:13:01.113 ], 00:13:01.113 "driver_specific": {} 00:13:01.113 } 00:13:01.113 ] 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.113 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.374 13:31:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.374 "name": "Existed_Raid", 00:13:01.374 "uuid": "5c6548fa-de0c-47ba-990d-e67e03dedf04", 00:13:01.374 "strip_size_kb": 64, 00:13:01.374 "state": "online", 00:13:01.374 "raid_level": "raid0", 00:13:01.374 "superblock": false, 00:13:01.374 "num_base_bdevs": 3, 00:13:01.374 "num_base_bdevs_discovered": 3, 00:13:01.374 "num_base_bdevs_operational": 3, 00:13:01.374 "base_bdevs_list": [ 00:13:01.374 { 00:13:01.374 "name": "NewBaseBdev", 00:13:01.374 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:13:01.374 "is_configured": true, 00:13:01.374 "data_offset": 0, 00:13:01.374 "data_size": 65536 00:13:01.374 }, 00:13:01.374 { 00:13:01.374 "name": "BaseBdev2", 00:13:01.374 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:13:01.374 "is_configured": true, 00:13:01.374 "data_offset": 0, 00:13:01.374 "data_size": 65536 00:13:01.374 }, 00:13:01.374 { 00:13:01.374 "name": "BaseBdev3", 00:13:01.374 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:13:01.374 "is_configured": true, 00:13:01.374 "data_offset": 0, 00:13:01.374 "data_size": 65536 00:13:01.374 } 00:13:01.374 ] 00:13:01.374 }' 00:13:01.374 13:31:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.374 13:31:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:01.942 13:31:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:01.942 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:02.511 [2024-07-15 13:31:41.783062] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:02.511 "name": "Existed_Raid", 00:13:02.511 "aliases": [ 00:13:02.511 "5c6548fa-de0c-47ba-990d-e67e03dedf04" 00:13:02.511 ], 00:13:02.511 "product_name": "Raid Volume", 00:13:02.511 "block_size": 512, 00:13:02.511 "num_blocks": 196608, 00:13:02.511 "uuid": "5c6548fa-de0c-47ba-990d-e67e03dedf04", 00:13:02.511 "assigned_rate_limits": { 00:13:02.511 "rw_ios_per_sec": 0, 00:13:02.511 "rw_mbytes_per_sec": 0, 00:13:02.511 "r_mbytes_per_sec": 0, 00:13:02.511 "w_mbytes_per_sec": 0 00:13:02.511 }, 00:13:02.511 "claimed": false, 00:13:02.511 "zoned": false, 00:13:02.511 "supported_io_types": { 00:13:02.511 "read": true, 00:13:02.511 "write": true, 00:13:02.511 "unmap": true, 00:13:02.511 "flush": true, 00:13:02.511 "reset": true, 00:13:02.511 "nvme_admin": false, 00:13:02.511 "nvme_io": false, 00:13:02.511 "nvme_io_md": false, 00:13:02.511 "write_zeroes": true, 00:13:02.511 "zcopy": false, 00:13:02.511 "get_zone_info": false, 00:13:02.511 "zone_management": false, 00:13:02.511 "zone_append": false, 00:13:02.511 "compare": false, 00:13:02.511 "compare_and_write": false, 00:13:02.511 "abort": false, 00:13:02.511 "seek_hole": false, 00:13:02.511 "seek_data": false, 00:13:02.511 "copy": false, 00:13:02.511 "nvme_iov_md": false 00:13:02.511 }, 00:13:02.511 "memory_domains": [ 00:13:02.511 { 00:13:02.511 "dma_device_id": "system", 00:13:02.511 "dma_device_type": 1 00:13:02.511 }, 
00:13:02.511 { 00:13:02.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.511 "dma_device_type": 2 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "dma_device_id": "system", 00:13:02.511 "dma_device_type": 1 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.511 "dma_device_type": 2 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "dma_device_id": "system", 00:13:02.511 "dma_device_type": 1 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.511 "dma_device_type": 2 00:13:02.511 } 00:13:02.511 ], 00:13:02.511 "driver_specific": { 00:13:02.511 "raid": { 00:13:02.511 "uuid": "5c6548fa-de0c-47ba-990d-e67e03dedf04", 00:13:02.511 "strip_size_kb": 64, 00:13:02.511 "state": "online", 00:13:02.511 "raid_level": "raid0", 00:13:02.511 "superblock": false, 00:13:02.511 "num_base_bdevs": 3, 00:13:02.511 "num_base_bdevs_discovered": 3, 00:13:02.511 "num_base_bdevs_operational": 3, 00:13:02.511 "base_bdevs_list": [ 00:13:02.511 { 00:13:02.511 "name": "NewBaseBdev", 00:13:02.511 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:13:02.511 "is_configured": true, 00:13:02.511 "data_offset": 0, 00:13:02.511 "data_size": 65536 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "name": "BaseBdev2", 00:13:02.511 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:13:02.511 "is_configured": true, 00:13:02.511 "data_offset": 0, 00:13:02.511 "data_size": 65536 00:13:02.511 }, 00:13:02.511 { 00:13:02.511 "name": "BaseBdev3", 00:13:02.511 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:13:02.511 "is_configured": true, 00:13:02.511 "data_offset": 0, 00:13:02.511 "data_size": 65536 00:13:02.511 } 00:13:02.511 ] 00:13:02.511 } 00:13:02.511 } 00:13:02.511 }' 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:13:02.511 BaseBdev2 00:13:02.511 BaseBdev3' 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:02.511 13:31:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.770 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.770 "name": "NewBaseBdev", 00:13:02.770 "aliases": [ 00:13:02.770 "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c" 00:13:02.770 ], 00:13:02.770 "product_name": "Malloc disk", 00:13:02.770 "block_size": 512, 00:13:02.770 "num_blocks": 65536, 00:13:02.770 "uuid": "8d90a5c4-1e0d-4b44-aba2-13d0cfe9054c", 00:13:02.770 "assigned_rate_limits": { 00:13:02.770 "rw_ios_per_sec": 0, 00:13:02.770 "rw_mbytes_per_sec": 0, 00:13:02.770 "r_mbytes_per_sec": 0, 00:13:02.770 "w_mbytes_per_sec": 0 00:13:02.770 }, 00:13:02.770 "claimed": true, 00:13:02.770 "claim_type": "exclusive_write", 00:13:02.770 "zoned": false, 00:13:02.770 "supported_io_types": { 00:13:02.770 "read": true, 00:13:02.770 "write": true, 00:13:02.770 "unmap": true, 00:13:02.770 "flush": true, 00:13:02.770 "reset": true, 00:13:02.770 "nvme_admin": false, 00:13:02.770 "nvme_io": false, 00:13:02.770 "nvme_io_md": false, 00:13:02.770 "write_zeroes": true, 00:13:02.770 "zcopy": true, 00:13:02.770 "get_zone_info": false, 00:13:02.770 "zone_management": false, 00:13:02.770 "zone_append": false, 00:13:02.770 "compare": false, 00:13:02.770 "compare_and_write": false, 00:13:02.770 "abort": true, 00:13:02.770 "seek_hole": false, 00:13:02.770 "seek_data": false, 00:13:02.770 "copy": true, 00:13:02.770 "nvme_iov_md": false 00:13:02.770 }, 00:13:02.770 "memory_domains": [ 00:13:02.770 { 00:13:02.770 "dma_device_id": "system", 00:13:02.770 
"dma_device_type": 1 00:13:02.770 }, 00:13:02.770 { 00:13:02.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.770 "dma_device_type": 2 00:13:02.770 } 00:13:02.770 ], 00:13:02.770 "driver_specific": {} 00:13:02.770 }' 00:13:02.770 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.770 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.029 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.288 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.288 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.288 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:03.288 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.548 "name": 
"BaseBdev2", 00:13:03.548 "aliases": [ 00:13:03.548 "d675fd08-35bd-4f01-ba10-e161a01650db" 00:13:03.548 ], 00:13:03.548 "product_name": "Malloc disk", 00:13:03.548 "block_size": 512, 00:13:03.548 "num_blocks": 65536, 00:13:03.548 "uuid": "d675fd08-35bd-4f01-ba10-e161a01650db", 00:13:03.548 "assigned_rate_limits": { 00:13:03.548 "rw_ios_per_sec": 0, 00:13:03.548 "rw_mbytes_per_sec": 0, 00:13:03.548 "r_mbytes_per_sec": 0, 00:13:03.548 "w_mbytes_per_sec": 0 00:13:03.548 }, 00:13:03.548 "claimed": true, 00:13:03.548 "claim_type": "exclusive_write", 00:13:03.548 "zoned": false, 00:13:03.548 "supported_io_types": { 00:13:03.548 "read": true, 00:13:03.548 "write": true, 00:13:03.548 "unmap": true, 00:13:03.548 "flush": true, 00:13:03.548 "reset": true, 00:13:03.548 "nvme_admin": false, 00:13:03.548 "nvme_io": false, 00:13:03.548 "nvme_io_md": false, 00:13:03.548 "write_zeroes": true, 00:13:03.548 "zcopy": true, 00:13:03.548 "get_zone_info": false, 00:13:03.548 "zone_management": false, 00:13:03.548 "zone_append": false, 00:13:03.548 "compare": false, 00:13:03.548 "compare_and_write": false, 00:13:03.548 "abort": true, 00:13:03.548 "seek_hole": false, 00:13:03.548 "seek_data": false, 00:13:03.548 "copy": true, 00:13:03.548 "nvme_iov_md": false 00:13:03.548 }, 00:13:03.548 "memory_domains": [ 00:13:03.548 { 00:13:03.548 "dma_device_id": "system", 00:13:03.548 "dma_device_type": 1 00:13:03.548 }, 00:13:03.548 { 00:13:03.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.548 "dma_device_type": 2 00:13:03.548 } 00:13:03.548 ], 00:13:03.548 "driver_specific": {} 00:13:03.548 }' 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.548 13:31:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.808 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.808 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.808 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.808 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:03.808 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.066 "name": "BaseBdev3", 00:13:04.066 "aliases": [ 00:13:04.066 "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3" 00:13:04.066 ], 00:13:04.066 "product_name": "Malloc disk", 00:13:04.066 "block_size": 512, 00:13:04.066 "num_blocks": 65536, 00:13:04.066 "uuid": "c15f07b4-d8cd-4fb7-a800-b3f28e924ad3", 00:13:04.066 "assigned_rate_limits": { 00:13:04.066 "rw_ios_per_sec": 0, 00:13:04.066 "rw_mbytes_per_sec": 0, 00:13:04.066 "r_mbytes_per_sec": 0, 00:13:04.066 "w_mbytes_per_sec": 0 00:13:04.066 }, 00:13:04.066 "claimed": true, 00:13:04.066 "claim_type": "exclusive_write", 00:13:04.066 "zoned": false, 00:13:04.066 "supported_io_types": { 
00:13:04.066 "read": true, 00:13:04.066 "write": true, 00:13:04.066 "unmap": true, 00:13:04.066 "flush": true, 00:13:04.066 "reset": true, 00:13:04.066 "nvme_admin": false, 00:13:04.066 "nvme_io": false, 00:13:04.066 "nvme_io_md": false, 00:13:04.066 "write_zeroes": true, 00:13:04.066 "zcopy": true, 00:13:04.066 "get_zone_info": false, 00:13:04.066 "zone_management": false, 00:13:04.066 "zone_append": false, 00:13:04.066 "compare": false, 00:13:04.066 "compare_and_write": false, 00:13:04.066 "abort": true, 00:13:04.066 "seek_hole": false, 00:13:04.066 "seek_data": false, 00:13:04.066 "copy": true, 00:13:04.066 "nvme_iov_md": false 00:13:04.066 }, 00:13:04.066 "memory_domains": [ 00:13:04.066 { 00:13:04.066 "dma_device_id": "system", 00:13:04.066 "dma_device_type": 1 00:13:04.066 }, 00:13:04.066 { 00:13:04.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.066 "dma_device_type": 2 00:13:04.066 } 00:13:04.066 ], 00:13:04.066 "driver_specific": {} 00:13:04.066 }' 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.066 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.323 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.323 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.323 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.323 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:04.324 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.324 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.324 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:04.583 [2024-07-15 13:31:43.892380] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:04.583 [2024-07-15 13:31:43.892414] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:04.583 [2024-07-15 13:31:43.892465] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:04.583 [2024-07-15 13:31:43.892514] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:04.583 [2024-07-15 13:31:43.892526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2159450 name Existed_Raid, state offline 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2087394 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2087394 ']' 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2087394 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2087394 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2087394' 00:13:04.583 killing process with pid 2087394 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2087394 00:13:04.583 [2024-07-15 13:31:43.956960] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:04.583 13:31:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2087394 00:13:04.583 [2024-07-15 13:31:44.005519] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:05.152 00:13:05.152 real 0m28.728s 00:13:05.152 user 0m52.706s 00:13:05.152 sys 0m4.928s 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.152 ************************************ 00:13:05.152 END TEST raid_state_function_test 00:13:05.152 ************************************ 00:13:05.152 13:31:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:05.152 13:31:44 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:05.152 13:31:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:05.152 13:31:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:05.152 13:31:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:05.152 ************************************ 00:13:05.152 START TEST raid_state_function_test_sb 00:13:05.152 ************************************ 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:05.152 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2091738 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2091738' 00:13:05.153 Process raid pid: 2091738 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2091738 /var/tmp/spdk-raid.sock 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2091738 ']' 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:05.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:05.153 13:31:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:05.153 [2024-07-15 13:31:44.516617] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:13:05.153 [2024-07-15 13:31:44.516688] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:05.412 [2024-07-15 13:31:44.649150] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:05.412 [2024-07-15 13:31:44.757159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:05.412 [2024-07-15 13:31:44.828690] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:05.412 [2024-07-15 13:31:44.828727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:06.351 [2024-07-15 13:31:45.588693] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:06.351 [2024-07-15 13:31:45.588739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:13:06.351 [2024-07-15 13:31:45.588750] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:06.351 [2024-07-15 13:31:45.588762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:06.351 [2024-07-15 13:31:45.588771] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:06.351 [2024-07-15 13:31:45.588782] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.351 13:31:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.919 13:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.919 "name": "Existed_Raid", 00:13:06.919 "uuid": "e0053620-b39e-4c73-8ede-264269097e45", 00:13:06.919 "strip_size_kb": 64, 00:13:06.919 "state": "configuring", 00:13:06.919 "raid_level": "raid0", 00:13:06.919 "superblock": true, 00:13:06.919 "num_base_bdevs": 3, 00:13:06.919 "num_base_bdevs_discovered": 0, 00:13:06.919 "num_base_bdevs_operational": 3, 00:13:06.919 "base_bdevs_list": [ 00:13:06.919 { 00:13:06.919 "name": "BaseBdev1", 00:13:06.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.919 "is_configured": false, 00:13:06.919 "data_offset": 0, 00:13:06.919 "data_size": 0 00:13:06.919 }, 00:13:06.919 { 00:13:06.919 "name": "BaseBdev2", 00:13:06.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.919 "is_configured": false, 00:13:06.919 "data_offset": 0, 00:13:06.919 "data_size": 0 00:13:06.919 }, 00:13:06.919 { 00:13:06.919 "name": "BaseBdev3", 00:13:06.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.919 "is_configured": false, 00:13:06.919 "data_offset": 0, 00:13:06.919 "data_size": 0 00:13:06.919 } 00:13:06.919 ] 00:13:06.919 }' 00:13:06.919 13:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.920 13:31:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:07.488 13:31:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:07.748 [2024-07-15 13:31:46.916056] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:07.748 [2024-07-15 13:31:46.916084] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8c7a80 name Existed_Raid, state configuring 00:13:07.748 13:31:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:07.748 [2024-07-15 13:31:47.164735] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:07.748 [2024-07-15 13:31:47.164760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:07.748 [2024-07-15 13:31:47.164769] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:07.748 [2024-07-15 13:31:47.164788] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:07.748 [2024-07-15 13:31:47.164797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:07.748 [2024-07-15 13:31:47.164808] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:08.007 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:08.007 [2024-07-15 13:31:47.415263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:08.007 BaseBdev1 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:08.266 13:31:47 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.266 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:08.526 [ 00:13:08.526 { 00:13:08.526 "name": "BaseBdev1", 00:13:08.526 "aliases": [ 00:13:08.526 "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63" 00:13:08.526 ], 00:13:08.526 "product_name": "Malloc disk", 00:13:08.526 "block_size": 512, 00:13:08.526 "num_blocks": 65536, 00:13:08.526 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:08.526 "assigned_rate_limits": { 00:13:08.526 "rw_ios_per_sec": 0, 00:13:08.526 "rw_mbytes_per_sec": 0, 00:13:08.526 "r_mbytes_per_sec": 0, 00:13:08.526 "w_mbytes_per_sec": 0 00:13:08.526 }, 00:13:08.526 "claimed": true, 00:13:08.526 "claim_type": "exclusive_write", 00:13:08.526 "zoned": false, 00:13:08.526 "supported_io_types": { 00:13:08.526 "read": true, 00:13:08.526 "write": true, 00:13:08.526 "unmap": true, 00:13:08.526 "flush": true, 00:13:08.526 "reset": true, 00:13:08.526 "nvme_admin": false, 00:13:08.526 "nvme_io": false, 00:13:08.526 "nvme_io_md": false, 00:13:08.526 "write_zeroes": true, 00:13:08.526 "zcopy": true, 00:13:08.526 "get_zone_info": false, 00:13:08.526 "zone_management": false, 00:13:08.526 "zone_append": false, 00:13:08.526 "compare": false, 00:13:08.526 "compare_and_write": false, 00:13:08.526 "abort": true, 00:13:08.526 "seek_hole": false, 00:13:08.526 "seek_data": false, 00:13:08.526 "copy": true, 00:13:08.526 "nvme_iov_md": false 00:13:08.526 }, 00:13:08.526 "memory_domains": [ 00:13:08.526 { 00:13:08.526 "dma_device_id": "system", 00:13:08.526 "dma_device_type": 1 00:13:08.526 }, 
00:13:08.526 { 00:13:08.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.526 "dma_device_type": 2 00:13:08.526 } 00:13:08.526 ], 00:13:08.526 "driver_specific": {} 00:13:08.526 } 00:13:08.526 ] 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.526 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.527 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.527 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.527 13:31:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.095 13:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.095 "name": "Existed_Raid", 00:13:09.095 
"uuid": "c3eb9a7b-225b-4d77-acfe-819c525bf884", 00:13:09.095 "strip_size_kb": 64, 00:13:09.095 "state": "configuring", 00:13:09.095 "raid_level": "raid0", 00:13:09.095 "superblock": true, 00:13:09.095 "num_base_bdevs": 3, 00:13:09.095 "num_base_bdevs_discovered": 1, 00:13:09.095 "num_base_bdevs_operational": 3, 00:13:09.095 "base_bdevs_list": [ 00:13:09.095 { 00:13:09.095 "name": "BaseBdev1", 00:13:09.095 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:09.095 "is_configured": true, 00:13:09.095 "data_offset": 2048, 00:13:09.095 "data_size": 63488 00:13:09.095 }, 00:13:09.095 { 00:13:09.095 "name": "BaseBdev2", 00:13:09.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.095 "is_configured": false, 00:13:09.095 "data_offset": 0, 00:13:09.095 "data_size": 0 00:13:09.095 }, 00:13:09.095 { 00:13:09.095 "name": "BaseBdev3", 00:13:09.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.095 "is_configured": false, 00:13:09.095 "data_offset": 0, 00:13:09.095 "data_size": 0 00:13:09.095 } 00:13:09.095 ] 00:13:09.095 }' 00:13:09.095 13:31:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.095 13:31:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.663 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.923 [2024-07-15 13:31:49.252112] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.923 [2024-07-15 13:31:49.252151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8c7310 name Existed_Raid, state configuring 00:13:09.923 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n 
Existed_Raid 00:13:10.182 [2024-07-15 13:31:49.496802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.182 [2024-07-15 13:31:49.498246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:10.182 [2024-07-15 13:31:49.498278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:10.182 [2024-07-15 13:31:49.498288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:10.182 [2024-07-15 13:31:49.498300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.182 13:31:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.182 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.441 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.441 "name": "Existed_Raid", 00:13:10.441 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:10.441 "strip_size_kb": 64, 00:13:10.441 "state": "configuring", 00:13:10.441 "raid_level": "raid0", 00:13:10.441 "superblock": true, 00:13:10.441 "num_base_bdevs": 3, 00:13:10.441 "num_base_bdevs_discovered": 1, 00:13:10.441 "num_base_bdevs_operational": 3, 00:13:10.441 "base_bdevs_list": [ 00:13:10.441 { 00:13:10.441 "name": "BaseBdev1", 00:13:10.441 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:10.441 "is_configured": true, 00:13:10.441 "data_offset": 2048, 00:13:10.441 "data_size": 63488 00:13:10.441 }, 00:13:10.441 { 00:13:10.441 "name": "BaseBdev2", 00:13:10.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.441 "is_configured": false, 00:13:10.441 "data_offset": 0, 00:13:10.441 "data_size": 0 00:13:10.441 }, 00:13:10.441 { 00:13:10.441 "name": "BaseBdev3", 00:13:10.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.441 "is_configured": false, 00:13:10.441 "data_offset": 0, 00:13:10.441 "data_size": 0 00:13:10.441 } 00:13:10.441 ] 00:13:10.441 }' 00:13:10.441 13:31:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.441 13:31:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.037 13:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:11.296 [2024-07-15 13:31:50.583069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:11.296 BaseBdev2 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:11.296 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:11.554 13:31:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:11.813 [ 00:13:11.813 { 00:13:11.813 "name": "BaseBdev2", 00:13:11.813 "aliases": [ 00:13:11.813 "b8f2e3b4-456f-41ce-9814-7f27ab16f258" 00:13:11.813 ], 00:13:11.813 "product_name": "Malloc disk", 00:13:11.813 "block_size": 512, 00:13:11.813 "num_blocks": 65536, 00:13:11.813 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:11.813 "assigned_rate_limits": { 00:13:11.813 "rw_ios_per_sec": 0, 00:13:11.813 "rw_mbytes_per_sec": 0, 00:13:11.813 "r_mbytes_per_sec": 0, 00:13:11.813 "w_mbytes_per_sec": 0 00:13:11.813 }, 00:13:11.813 "claimed": true, 00:13:11.813 "claim_type": "exclusive_write", 00:13:11.813 "zoned": false, 00:13:11.813 "supported_io_types": { 
00:13:11.813 "read": true, 00:13:11.813 "write": true, 00:13:11.813 "unmap": true, 00:13:11.813 "flush": true, 00:13:11.813 "reset": true, 00:13:11.813 "nvme_admin": false, 00:13:11.813 "nvme_io": false, 00:13:11.813 "nvme_io_md": false, 00:13:11.813 "write_zeroes": true, 00:13:11.813 "zcopy": true, 00:13:11.813 "get_zone_info": false, 00:13:11.813 "zone_management": false, 00:13:11.813 "zone_append": false, 00:13:11.813 "compare": false, 00:13:11.813 "compare_and_write": false, 00:13:11.813 "abort": true, 00:13:11.813 "seek_hole": false, 00:13:11.813 "seek_data": false, 00:13:11.813 "copy": true, 00:13:11.813 "nvme_iov_md": false 00:13:11.813 }, 00:13:11.813 "memory_domains": [ 00:13:11.813 { 00:13:11.813 "dma_device_id": "system", 00:13:11.813 "dma_device_type": 1 00:13:11.813 }, 00:13:11.813 { 00:13:11.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.813 "dma_device_type": 2 00:13:11.813 } 00:13:11.813 ], 00:13:11.813 "driver_specific": {} 00:13:11.813 } 00:13:11.813 ] 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.813 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.072 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.072 "name": "Existed_Raid", 00:13:12.072 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:12.072 "strip_size_kb": 64, 00:13:12.072 "state": "configuring", 00:13:12.072 "raid_level": "raid0", 00:13:12.072 "superblock": true, 00:13:12.072 "num_base_bdevs": 3, 00:13:12.072 "num_base_bdevs_discovered": 2, 00:13:12.072 "num_base_bdevs_operational": 3, 00:13:12.072 "base_bdevs_list": [ 00:13:12.072 { 00:13:12.072 "name": "BaseBdev1", 00:13:12.072 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:12.072 "is_configured": true, 00:13:12.072 "data_offset": 2048, 00:13:12.072 "data_size": 63488 00:13:12.072 }, 00:13:12.072 { 00:13:12.072 "name": "BaseBdev2", 00:13:12.072 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:12.072 "is_configured": true, 00:13:12.072 "data_offset": 2048, 00:13:12.072 "data_size": 63488 00:13:12.072 }, 00:13:12.072 { 00:13:12.072 "name": "BaseBdev3", 00:13:12.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.072 "is_configured": false, 00:13:12.072 "data_offset": 0, 00:13:12.072 
"data_size": 0 00:13:12.072 } 00:13:12.072 ] 00:13:12.072 }' 00:13:12.072 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.072 13:31:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.641 13:31:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:12.900 [2024-07-15 13:31:52.162913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:12.900 [2024-07-15 13:31:52.163094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8c8400 00:13:12.900 [2024-07-15 13:31:52.163108] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:12.900 [2024-07-15 13:31:52.163281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8c7ef0 00:13:12.900 [2024-07-15 13:31:52.163393] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8c8400 00:13:12.900 [2024-07-15 13:31:52.163403] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8c8400 00:13:12.900 [2024-07-15 13:31:52.163493] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.900 BaseBdev3 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:12.900 13:31:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:12.900 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.159 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:13.418 [ 00:13:13.418 { 00:13:13.418 "name": "BaseBdev3", 00:13:13.418 "aliases": [ 00:13:13.418 "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa" 00:13:13.418 ], 00:13:13.418 "product_name": "Malloc disk", 00:13:13.418 "block_size": 512, 00:13:13.418 "num_blocks": 65536, 00:13:13.418 "uuid": "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa", 00:13:13.418 "assigned_rate_limits": { 00:13:13.418 "rw_ios_per_sec": 0, 00:13:13.418 "rw_mbytes_per_sec": 0, 00:13:13.418 "r_mbytes_per_sec": 0, 00:13:13.418 "w_mbytes_per_sec": 0 00:13:13.418 }, 00:13:13.418 "claimed": true, 00:13:13.418 "claim_type": "exclusive_write", 00:13:13.418 "zoned": false, 00:13:13.418 "supported_io_types": { 00:13:13.418 "read": true, 00:13:13.418 "write": true, 00:13:13.418 "unmap": true, 00:13:13.418 "flush": true, 00:13:13.418 "reset": true, 00:13:13.418 "nvme_admin": false, 00:13:13.418 "nvme_io": false, 00:13:13.418 "nvme_io_md": false, 00:13:13.418 "write_zeroes": true, 00:13:13.418 "zcopy": true, 00:13:13.418 "get_zone_info": false, 00:13:13.418 "zone_management": false, 00:13:13.418 "zone_append": false, 00:13:13.418 "compare": false, 00:13:13.418 "compare_and_write": false, 00:13:13.418 "abort": true, 00:13:13.418 "seek_hole": false, 00:13:13.418 "seek_data": false, 00:13:13.418 "copy": true, 00:13:13.418 "nvme_iov_md": false 00:13:13.418 }, 00:13:13.418 "memory_domains": [ 00:13:13.418 { 00:13:13.418 "dma_device_id": "system", 00:13:13.418 "dma_device_type": 1 00:13:13.418 }, 
00:13:13.418 { 00:13:13.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.418 "dma_device_type": 2 00:13:13.418 } 00:13:13.418 ], 00:13:13.418 "driver_specific": {} 00:13:13.418 } 00:13:13.418 ] 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.418 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:13:13.676 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.676 "name": "Existed_Raid", 00:13:13.676 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:13.676 "strip_size_kb": 64, 00:13:13.676 "state": "online", 00:13:13.676 "raid_level": "raid0", 00:13:13.676 "superblock": true, 00:13:13.676 "num_base_bdevs": 3, 00:13:13.676 "num_base_bdevs_discovered": 3, 00:13:13.676 "num_base_bdevs_operational": 3, 00:13:13.676 "base_bdevs_list": [ 00:13:13.676 { 00:13:13.676 "name": "BaseBdev1", 00:13:13.676 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:13.676 "is_configured": true, 00:13:13.676 "data_offset": 2048, 00:13:13.676 "data_size": 63488 00:13:13.676 }, 00:13:13.676 { 00:13:13.676 "name": "BaseBdev2", 00:13:13.676 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:13.676 "is_configured": true, 00:13:13.676 "data_offset": 2048, 00:13:13.676 "data_size": 63488 00:13:13.676 }, 00:13:13.676 { 00:13:13.676 "name": "BaseBdev3", 00:13:13.676 "uuid": "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa", 00:13:13.676 "is_configured": true, 00:13:13.676 "data_offset": 2048, 00:13:13.676 "data_size": 63488 00:13:13.676 } 00:13:13.676 ] 00:13:13.676 }' 00:13:13.676 13:31:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.676 13:31:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.242 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.242 [2024-07-15 13:31:53.659178] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.500 "name": "Existed_Raid", 00:13:14.500 "aliases": [ 00:13:14.500 "ae8cc548-2537-41b4-9035-dd147bdfa952" 00:13:14.500 ], 00:13:14.500 "product_name": "Raid Volume", 00:13:14.500 "block_size": 512, 00:13:14.500 "num_blocks": 190464, 00:13:14.500 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:14.500 "assigned_rate_limits": { 00:13:14.500 "rw_ios_per_sec": 0, 00:13:14.500 "rw_mbytes_per_sec": 0, 00:13:14.500 "r_mbytes_per_sec": 0, 00:13:14.500 "w_mbytes_per_sec": 0 00:13:14.500 }, 00:13:14.500 "claimed": false, 00:13:14.500 "zoned": false, 00:13:14.500 "supported_io_types": { 00:13:14.500 "read": true, 00:13:14.500 "write": true, 00:13:14.500 "unmap": true, 00:13:14.500 "flush": true, 00:13:14.500 "reset": true, 00:13:14.500 "nvme_admin": false, 00:13:14.500 "nvme_io": false, 00:13:14.500 "nvme_io_md": false, 00:13:14.500 "write_zeroes": true, 00:13:14.500 "zcopy": false, 00:13:14.500 "get_zone_info": false, 00:13:14.500 "zone_management": false, 00:13:14.500 "zone_append": false, 00:13:14.500 "compare": false, 00:13:14.500 "compare_and_write": false, 00:13:14.500 "abort": false, 00:13:14.500 "seek_hole": false, 00:13:14.500 "seek_data": false, 00:13:14.500 "copy": false, 00:13:14.500 "nvme_iov_md": false 00:13:14.500 }, 00:13:14.500 "memory_domains": [ 00:13:14.500 { 
00:13:14.500 "dma_device_id": "system", 00:13:14.500 "dma_device_type": 1 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.500 "dma_device_type": 2 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "dma_device_id": "system", 00:13:14.500 "dma_device_type": 1 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.500 "dma_device_type": 2 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "dma_device_id": "system", 00:13:14.500 "dma_device_type": 1 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.500 "dma_device_type": 2 00:13:14.500 } 00:13:14.500 ], 00:13:14.500 "driver_specific": { 00:13:14.500 "raid": { 00:13:14.500 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:14.500 "strip_size_kb": 64, 00:13:14.500 "state": "online", 00:13:14.500 "raid_level": "raid0", 00:13:14.500 "superblock": true, 00:13:14.500 "num_base_bdevs": 3, 00:13:14.500 "num_base_bdevs_discovered": 3, 00:13:14.500 "num_base_bdevs_operational": 3, 00:13:14.500 "base_bdevs_list": [ 00:13:14.500 { 00:13:14.500 "name": "BaseBdev1", 00:13:14.500 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:14.500 "is_configured": true, 00:13:14.500 "data_offset": 2048, 00:13:14.500 "data_size": 63488 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "name": "BaseBdev2", 00:13:14.500 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:14.500 "is_configured": true, 00:13:14.500 "data_offset": 2048, 00:13:14.500 "data_size": 63488 00:13:14.500 }, 00:13:14.500 { 00:13:14.500 "name": "BaseBdev3", 00:13:14.500 "uuid": "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa", 00:13:14.500 "is_configured": true, 00:13:14.500 "data_offset": 2048, 00:13:14.500 "data_size": 63488 00:13:14.500 } 00:13:14.500 ] 00:13:14.500 } 00:13:14.500 } 00:13:14.500 }' 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:14.500 BaseBdev2 00:13:14.500 BaseBdev3' 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:14.500 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.759 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.759 "name": "BaseBdev1", 00:13:14.759 "aliases": [ 00:13:14.759 "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63" 00:13:14.759 ], 00:13:14.759 "product_name": "Malloc disk", 00:13:14.759 "block_size": 512, 00:13:14.759 "num_blocks": 65536, 00:13:14.759 "uuid": "892cb91c-e1dc-4da5-9b8b-5fe8ef9f8f63", 00:13:14.759 "assigned_rate_limits": { 00:13:14.759 "rw_ios_per_sec": 0, 00:13:14.759 "rw_mbytes_per_sec": 0, 00:13:14.759 "r_mbytes_per_sec": 0, 00:13:14.759 "w_mbytes_per_sec": 0 00:13:14.759 }, 00:13:14.759 "claimed": true, 00:13:14.759 "claim_type": "exclusive_write", 00:13:14.759 "zoned": false, 00:13:14.759 "supported_io_types": { 00:13:14.759 "read": true, 00:13:14.759 "write": true, 00:13:14.759 "unmap": true, 00:13:14.759 "flush": true, 00:13:14.759 "reset": true, 00:13:14.759 "nvme_admin": false, 00:13:14.759 "nvme_io": false, 00:13:14.759 "nvme_io_md": false, 00:13:14.759 "write_zeroes": true, 00:13:14.759 "zcopy": true, 00:13:14.759 "get_zone_info": false, 00:13:14.759 "zone_management": false, 00:13:14.759 "zone_append": false, 00:13:14.759 "compare": false, 00:13:14.759 "compare_and_write": false, 00:13:14.759 "abort": true, 00:13:14.759 "seek_hole": false, 00:13:14.759 "seek_data": false, 00:13:14.759 "copy": true, 00:13:14.759 "nvme_iov_md": false 00:13:14.759 
}, 00:13:14.759 "memory_domains": [ 00:13:14.759 { 00:13:14.759 "dma_device_id": "system", 00:13:14.759 "dma_device_type": 1 00:13:14.759 }, 00:13:14.759 { 00:13:14.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.759 "dma_device_type": 2 00:13:14.759 } 00:13:14.759 ], 00:13:14.759 "driver_specific": {} 00:13:14.759 }' 00:13:14.759 13:31:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.759 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.018 13:31:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.276 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.276 "name": "BaseBdev2", 00:13:15.276 "aliases": [ 00:13:15.276 "b8f2e3b4-456f-41ce-9814-7f27ab16f258" 00:13:15.276 ], 00:13:15.276 "product_name": "Malloc disk", 00:13:15.276 "block_size": 512, 00:13:15.277 "num_blocks": 65536, 00:13:15.277 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:15.277 "assigned_rate_limits": { 00:13:15.277 "rw_ios_per_sec": 0, 00:13:15.277 "rw_mbytes_per_sec": 0, 00:13:15.277 "r_mbytes_per_sec": 0, 00:13:15.277 "w_mbytes_per_sec": 0 00:13:15.277 }, 00:13:15.277 "claimed": true, 00:13:15.277 "claim_type": "exclusive_write", 00:13:15.277 "zoned": false, 00:13:15.277 "supported_io_types": { 00:13:15.277 "read": true, 00:13:15.277 "write": true, 00:13:15.277 "unmap": true, 00:13:15.277 "flush": true, 00:13:15.277 "reset": true, 00:13:15.277 "nvme_admin": false, 00:13:15.277 "nvme_io": false, 00:13:15.277 "nvme_io_md": false, 00:13:15.277 "write_zeroes": true, 00:13:15.277 "zcopy": true, 00:13:15.277 "get_zone_info": false, 00:13:15.277 "zone_management": false, 00:13:15.277 "zone_append": false, 00:13:15.277 "compare": false, 00:13:15.277 "compare_and_write": false, 00:13:15.277 "abort": true, 00:13:15.277 "seek_hole": false, 00:13:15.277 "seek_data": false, 00:13:15.277 "copy": true, 00:13:15.277 "nvme_iov_md": false 00:13:15.277 }, 00:13:15.277 "memory_domains": [ 00:13:15.277 { 00:13:15.277 "dma_device_id": "system", 00:13:15.277 "dma_device_type": 1 00:13:15.277 }, 00:13:15.277 { 00:13:15.277 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.277 "dma_device_type": 2 00:13:15.277 } 00:13:15.277 ], 00:13:15.277 "driver_specific": {} 00:13:15.277 }' 00:13:15.277 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.277 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.277 13:31:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.277 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.277 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.277 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:15.536 13:31:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.794 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.794 "name": "BaseBdev3", 00:13:15.794 "aliases": [ 00:13:15.794 "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa" 00:13:15.794 ], 00:13:15.794 "product_name": "Malloc disk", 00:13:15.794 "block_size": 512, 00:13:15.794 "num_blocks": 65536, 00:13:15.794 "uuid": "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa", 00:13:15.794 "assigned_rate_limits": { 00:13:15.794 "rw_ios_per_sec": 0, 00:13:15.794 "rw_mbytes_per_sec": 0, 00:13:15.794 
"r_mbytes_per_sec": 0, 00:13:15.794 "w_mbytes_per_sec": 0 00:13:15.794 }, 00:13:15.794 "claimed": true, 00:13:15.794 "claim_type": "exclusive_write", 00:13:15.794 "zoned": false, 00:13:15.794 "supported_io_types": { 00:13:15.794 "read": true, 00:13:15.794 "write": true, 00:13:15.794 "unmap": true, 00:13:15.794 "flush": true, 00:13:15.794 "reset": true, 00:13:15.794 "nvme_admin": false, 00:13:15.794 "nvme_io": false, 00:13:15.794 "nvme_io_md": false, 00:13:15.794 "write_zeroes": true, 00:13:15.794 "zcopy": true, 00:13:15.794 "get_zone_info": false, 00:13:15.794 "zone_management": false, 00:13:15.794 "zone_append": false, 00:13:15.794 "compare": false, 00:13:15.794 "compare_and_write": false, 00:13:15.794 "abort": true, 00:13:15.794 "seek_hole": false, 00:13:15.794 "seek_data": false, 00:13:15.794 "copy": true, 00:13:15.794 "nvme_iov_md": false 00:13:15.794 }, 00:13:15.794 "memory_domains": [ 00:13:15.794 { 00:13:15.794 "dma_device_id": "system", 00:13:15.794 "dma_device_type": 1 00:13:15.794 }, 00:13:15.794 { 00:13:15.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.794 "dma_device_type": 2 00:13:15.794 } 00:13:15.794 ], 00:13:15.794 "driver_specific": {} 00:13:15.794 }' 00:13:15.794 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.794 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.053 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.311 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:16.311 13:31:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:16.569 [2024-07-15 13:31:55.977079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:16.569 [2024-07-15 13:31:55.977107] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:16.569 [2024-07-15 13:31:55.977151] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:16.828 13:31:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.828 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.087 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.087 "name": "Existed_Raid", 00:13:17.087 "uuid": "ae8cc548-2537-41b4-9035-dd147bdfa952", 00:13:17.087 "strip_size_kb": 64, 00:13:17.087 "state": "offline", 00:13:17.087 "raid_level": "raid0", 00:13:17.087 "superblock": true, 00:13:17.087 "num_base_bdevs": 3, 00:13:17.087 "num_base_bdevs_discovered": 2, 00:13:17.087 "num_base_bdevs_operational": 2, 00:13:17.087 "base_bdevs_list": [ 00:13:17.087 { 00:13:17.087 "name": null, 00:13:17.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.087 "is_configured": false, 00:13:17.087 "data_offset": 2048, 00:13:17.087 "data_size": 63488 00:13:17.087 }, 00:13:17.087 { 00:13:17.087 "name": "BaseBdev2", 00:13:17.087 "uuid": "b8f2e3b4-456f-41ce-9814-7f27ab16f258", 00:13:17.087 "is_configured": true, 00:13:17.087 
"data_offset": 2048, 00:13:17.087 "data_size": 63488 00:13:17.087 }, 00:13:17.087 { 00:13:17.087 "name": "BaseBdev3", 00:13:17.087 "uuid": "f6ed96f1-a7d5-454e-9b8f-613dcd71b0fa", 00:13:17.087 "is_configured": true, 00:13:17.087 "data_offset": 2048, 00:13:17.087 "data_size": 63488 00:13:17.087 } 00:13:17.087 ] 00:13:17.087 }' 00:13:17.087 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.087 13:31:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.655 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:17.655 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.655 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:17.655 13:31:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.914 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:17.914 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:17.914 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:17.914 [2024-07-15 13:31:57.330585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:18.173 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:18.173 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:18.173 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.173 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:18.432 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:18.432 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:18.433 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:18.433 [2024-07-15 13:31:57.831995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:18.433 [2024-07-15 13:31:57.832038] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8c8400 name Existed_Raid, state offline 00:13:18.692 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:18.692 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:18.692 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.692 13:31:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:18.692 13:31:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:18.951 BaseBdev2 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:18.951 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:19.210 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:19.469 [ 00:13:19.469 { 00:13:19.469 "name": "BaseBdev2", 00:13:19.469 "aliases": [ 00:13:19.469 "e3e0db57-4961-4145-86b4-b9c4f361043e" 00:13:19.469 ], 00:13:19.469 "product_name": "Malloc disk", 00:13:19.469 "block_size": 512, 00:13:19.469 "num_blocks": 65536, 00:13:19.469 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:19.469 "assigned_rate_limits": { 00:13:19.469 "rw_ios_per_sec": 0, 00:13:19.469 "rw_mbytes_per_sec": 0, 00:13:19.469 "r_mbytes_per_sec": 0, 00:13:19.469 "w_mbytes_per_sec": 0 00:13:19.469 }, 00:13:19.469 "claimed": false, 00:13:19.469 "zoned": false, 00:13:19.469 "supported_io_types": { 00:13:19.469 "read": true, 00:13:19.469 "write": true, 00:13:19.469 "unmap": 
true, 00:13:19.469 "flush": true, 00:13:19.469 "reset": true, 00:13:19.469 "nvme_admin": false, 00:13:19.469 "nvme_io": false, 00:13:19.469 "nvme_io_md": false, 00:13:19.469 "write_zeroes": true, 00:13:19.469 "zcopy": true, 00:13:19.469 "get_zone_info": false, 00:13:19.469 "zone_management": false, 00:13:19.469 "zone_append": false, 00:13:19.469 "compare": false, 00:13:19.469 "compare_and_write": false, 00:13:19.469 "abort": true, 00:13:19.469 "seek_hole": false, 00:13:19.469 "seek_data": false, 00:13:19.469 "copy": true, 00:13:19.469 "nvme_iov_md": false 00:13:19.469 }, 00:13:19.469 "memory_domains": [ 00:13:19.469 { 00:13:19.469 "dma_device_id": "system", 00:13:19.469 "dma_device_type": 1 00:13:19.469 }, 00:13:19.469 { 00:13:19.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.469 "dma_device_type": 2 00:13:19.469 } 00:13:19.469 ], 00:13:19.469 "driver_specific": {} 00:13:19.469 } 00:13:19.469 ] 00:13:19.469 13:31:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:19.469 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:19.469 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:19.469 13:31:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:19.728 BaseBdev3 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:19.728 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:19.987 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:20.246 [ 00:13:20.246 { 00:13:20.246 "name": "BaseBdev3", 00:13:20.246 "aliases": [ 00:13:20.246 "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5" 00:13:20.246 ], 00:13:20.246 "product_name": "Malloc disk", 00:13:20.246 "block_size": 512, 00:13:20.246 "num_blocks": 65536, 00:13:20.246 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:20.246 "assigned_rate_limits": { 00:13:20.246 "rw_ios_per_sec": 0, 00:13:20.246 "rw_mbytes_per_sec": 0, 00:13:20.246 "r_mbytes_per_sec": 0, 00:13:20.246 "w_mbytes_per_sec": 0 00:13:20.246 }, 00:13:20.246 "claimed": false, 00:13:20.246 "zoned": false, 00:13:20.246 "supported_io_types": { 00:13:20.246 "read": true, 00:13:20.246 "write": true, 00:13:20.246 "unmap": true, 00:13:20.246 "flush": true, 00:13:20.246 "reset": true, 00:13:20.246 "nvme_admin": false, 00:13:20.246 "nvme_io": false, 00:13:20.246 "nvme_io_md": false, 00:13:20.246 "write_zeroes": true, 00:13:20.246 "zcopy": true, 00:13:20.246 "get_zone_info": false, 00:13:20.246 "zone_management": false, 00:13:20.246 "zone_append": false, 00:13:20.246 "compare": false, 00:13:20.246 "compare_and_write": false, 00:13:20.246 "abort": true, 00:13:20.246 "seek_hole": false, 00:13:20.246 "seek_data": false, 00:13:20.246 "copy": true, 00:13:20.246 "nvme_iov_md": false 00:13:20.246 }, 00:13:20.246 "memory_domains": [ 00:13:20.246 { 00:13:20.246 "dma_device_id": "system", 00:13:20.246 "dma_device_type": 1 
00:13:20.246 }, 00:13:20.246 { 00:13:20.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.246 "dma_device_type": 2 00:13:20.246 } 00:13:20.246 ], 00:13:20.246 "driver_specific": {} 00:13:20.246 } 00:13:20.247 ] 00:13:20.247 13:31:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:20.247 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:20.247 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:20.247 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:20.505 [2024-07-15 13:31:59.814032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:20.505 [2024-07-15 13:31:59.814072] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:20.505 [2024-07-15 13:31:59.814090] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:20.505 [2024-07-15 13:31:59.815411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.505 13:31:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.762 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.762 "name": "Existed_Raid", 00:13:20.762 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:20.762 "strip_size_kb": 64, 00:13:20.762 "state": "configuring", 00:13:20.762 "raid_level": "raid0", 00:13:20.762 "superblock": true, 00:13:20.762 "num_base_bdevs": 3, 00:13:20.762 "num_base_bdevs_discovered": 2, 00:13:20.762 "num_base_bdevs_operational": 3, 00:13:20.762 "base_bdevs_list": [ 00:13:20.762 { 00:13:20.763 "name": "BaseBdev1", 00:13:20.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.763 "is_configured": false, 00:13:20.763 "data_offset": 0, 00:13:20.763 "data_size": 0 00:13:20.763 }, 00:13:20.763 { 00:13:20.763 "name": "BaseBdev2", 00:13:20.763 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:20.763 "is_configured": true, 00:13:20.763 "data_offset": 2048, 00:13:20.763 "data_size": 63488 00:13:20.763 }, 00:13:20.763 { 00:13:20.763 "name": "BaseBdev3", 00:13:20.763 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:20.763 "is_configured": true, 00:13:20.763 "data_offset": 2048, 00:13:20.763 
"data_size": 63488 00:13:20.763 } 00:13:20.763 ] 00:13:20.763 }' 00:13:20.763 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.763 13:32:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.329 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:21.588 [2024-07-15 13:32:00.888846] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.588 13:32:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.847 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.847 "name": "Existed_Raid", 00:13:21.847 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:21.847 "strip_size_kb": 64, 00:13:21.847 "state": "configuring", 00:13:21.847 "raid_level": "raid0", 00:13:21.847 "superblock": true, 00:13:21.847 "num_base_bdevs": 3, 00:13:21.847 "num_base_bdevs_discovered": 1, 00:13:21.847 "num_base_bdevs_operational": 3, 00:13:21.847 "base_bdevs_list": [ 00:13:21.847 { 00:13:21.847 "name": "BaseBdev1", 00:13:21.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.847 "is_configured": false, 00:13:21.847 "data_offset": 0, 00:13:21.847 "data_size": 0 00:13:21.847 }, 00:13:21.847 { 00:13:21.847 "name": null, 00:13:21.847 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:21.847 "is_configured": false, 00:13:21.847 "data_offset": 2048, 00:13:21.847 "data_size": 63488 00:13:21.847 }, 00:13:21.847 { 00:13:21.847 "name": "BaseBdev3", 00:13:21.847 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:21.847 "is_configured": true, 00:13:21.847 "data_offset": 2048, 00:13:21.847 "data_size": 63488 00:13:21.847 } 00:13:21.847 ] 00:13:21.847 }' 00:13:21.847 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.847 13:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:22.415 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.415 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:22.675 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:13:22.675 13:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:22.934 [2024-07-15 13:32:02.229015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:22.934 BaseBdev1 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:22.934 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:23.192 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:23.451 [ 00:13:23.451 { 00:13:23.451 "name": "BaseBdev1", 00:13:23.451 "aliases": [ 00:13:23.451 "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7" 00:13:23.451 ], 00:13:23.451 "product_name": "Malloc disk", 00:13:23.451 "block_size": 512, 00:13:23.451 "num_blocks": 65536, 00:13:23.451 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:23.451 "assigned_rate_limits": { 00:13:23.451 "rw_ios_per_sec": 0, 00:13:23.451 "rw_mbytes_per_sec": 0, 00:13:23.451 "r_mbytes_per_sec": 0, 00:13:23.451 
"w_mbytes_per_sec": 0 00:13:23.451 }, 00:13:23.451 "claimed": true, 00:13:23.451 "claim_type": "exclusive_write", 00:13:23.451 "zoned": false, 00:13:23.451 "supported_io_types": { 00:13:23.451 "read": true, 00:13:23.451 "write": true, 00:13:23.451 "unmap": true, 00:13:23.451 "flush": true, 00:13:23.451 "reset": true, 00:13:23.451 "nvme_admin": false, 00:13:23.451 "nvme_io": false, 00:13:23.451 "nvme_io_md": false, 00:13:23.451 "write_zeroes": true, 00:13:23.451 "zcopy": true, 00:13:23.451 "get_zone_info": false, 00:13:23.451 "zone_management": false, 00:13:23.451 "zone_append": false, 00:13:23.451 "compare": false, 00:13:23.451 "compare_and_write": false, 00:13:23.451 "abort": true, 00:13:23.451 "seek_hole": false, 00:13:23.451 "seek_data": false, 00:13:23.451 "copy": true, 00:13:23.451 "nvme_iov_md": false 00:13:23.451 }, 00:13:23.451 "memory_domains": [ 00:13:23.451 { 00:13:23.451 "dma_device_id": "system", 00:13:23.451 "dma_device_type": 1 00:13:23.451 }, 00:13:23.451 { 00:13:23.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.451 "dma_device_type": 2 00:13:23.451 } 00:13:23.451 ], 00:13:23.451 "driver_specific": {} 00:13:23.451 } 00:13:23.451 ] 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.451 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.710 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.710 "name": "Existed_Raid", 00:13:23.710 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:23.710 "strip_size_kb": 64, 00:13:23.710 "state": "configuring", 00:13:23.710 "raid_level": "raid0", 00:13:23.710 "superblock": true, 00:13:23.710 "num_base_bdevs": 3, 00:13:23.710 "num_base_bdevs_discovered": 2, 00:13:23.710 "num_base_bdevs_operational": 3, 00:13:23.710 "base_bdevs_list": [ 00:13:23.710 { 00:13:23.710 "name": "BaseBdev1", 00:13:23.710 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:23.710 "is_configured": true, 00:13:23.710 "data_offset": 2048, 00:13:23.710 "data_size": 63488 00:13:23.710 }, 00:13:23.710 { 00:13:23.710 "name": null, 00:13:23.710 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:23.710 "is_configured": false, 00:13:23.710 "data_offset": 2048, 00:13:23.710 "data_size": 63488 00:13:23.710 }, 00:13:23.710 { 00:13:23.710 "name": "BaseBdev3", 00:13:23.710 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:23.710 "is_configured": true, 00:13:23.710 "data_offset": 2048, 00:13:23.710 "data_size": 63488 00:13:23.710 } 
00:13:23.710 ] 00:13:23.710 }' 00:13:23.710 13:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.710 13:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:24.287 13:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.287 13:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:24.546 13:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:24.546 13:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:24.805 [2024-07-15 13:32:04.026002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.805 
13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.805 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.063 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.063 "name": "Existed_Raid", 00:13:25.063 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:25.063 "strip_size_kb": 64, 00:13:25.063 "state": "configuring", 00:13:25.063 "raid_level": "raid0", 00:13:25.063 "superblock": true, 00:13:25.063 "num_base_bdevs": 3, 00:13:25.063 "num_base_bdevs_discovered": 1, 00:13:25.063 "num_base_bdevs_operational": 3, 00:13:25.063 "base_bdevs_list": [ 00:13:25.063 { 00:13:25.063 "name": "BaseBdev1", 00:13:25.063 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:25.063 "is_configured": true, 00:13:25.063 "data_offset": 2048, 00:13:25.063 "data_size": 63488 00:13:25.063 }, 00:13:25.063 { 00:13:25.063 "name": null, 00:13:25.063 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:25.063 "is_configured": false, 00:13:25.063 "data_offset": 2048, 00:13:25.063 "data_size": 63488 00:13:25.063 }, 00:13:25.063 { 00:13:25.063 "name": null, 00:13:25.063 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:25.063 "is_configured": false, 00:13:25.063 "data_offset": 2048, 00:13:25.063 "data_size": 63488 00:13:25.063 } 00:13:25.063 ] 00:13:25.063 }' 00:13:25.063 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.064 13:32:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:25.630 13:32:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.630 13:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:25.889 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:25.889 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:26.153 [2024-07-15 13:32:05.357544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.153 13:32:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.153 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.411 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.411 "name": "Existed_Raid", 00:13:26.411 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:26.411 "strip_size_kb": 64, 00:13:26.411 "state": "configuring", 00:13:26.411 "raid_level": "raid0", 00:13:26.411 "superblock": true, 00:13:26.411 "num_base_bdevs": 3, 00:13:26.411 "num_base_bdevs_discovered": 2, 00:13:26.411 "num_base_bdevs_operational": 3, 00:13:26.411 "base_bdevs_list": [ 00:13:26.411 { 00:13:26.411 "name": "BaseBdev1", 00:13:26.411 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:26.411 "is_configured": true, 00:13:26.411 "data_offset": 2048, 00:13:26.411 "data_size": 63488 00:13:26.411 }, 00:13:26.411 { 00:13:26.411 "name": null, 00:13:26.411 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:26.411 "is_configured": false, 00:13:26.411 "data_offset": 2048, 00:13:26.411 "data_size": 63488 00:13:26.411 }, 00:13:26.411 { 00:13:26.411 "name": "BaseBdev3", 00:13:26.412 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:26.412 "is_configured": true, 00:13:26.412 "data_offset": 2048, 00:13:26.412 "data_size": 63488 00:13:26.412 } 00:13:26.412 ] 00:13:26.412 }' 00:13:26.412 13:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.412 13:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:26.977 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:26.977 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.235 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:27.235 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:27.564 [2024-07-15 13:32:06.701124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.564 13:32:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:27.821 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.821 "name": "Existed_Raid", 00:13:27.821 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:27.821 "strip_size_kb": 64, 00:13:27.821 "state": "configuring", 00:13:27.821 "raid_level": "raid0", 00:13:27.821 "superblock": true, 00:13:27.821 "num_base_bdevs": 3, 00:13:27.821 "num_base_bdevs_discovered": 1, 00:13:27.821 "num_base_bdevs_operational": 3, 00:13:27.821 "base_bdevs_list": [ 00:13:27.821 { 00:13:27.821 "name": null, 00:13:27.821 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:27.821 "is_configured": false, 00:13:27.821 "data_offset": 2048, 00:13:27.821 "data_size": 63488 00:13:27.821 }, 00:13:27.821 { 00:13:27.821 "name": null, 00:13:27.821 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:27.821 "is_configured": false, 00:13:27.821 "data_offset": 2048, 00:13:27.821 "data_size": 63488 00:13:27.821 }, 00:13:27.821 { 00:13:27.821 "name": "BaseBdev3", 00:13:27.821 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:27.821 "is_configured": true, 00:13:27.821 "data_offset": 2048, 00:13:27.821 "data_size": 63488 00:13:27.821 } 00:13:27.821 ] 00:13:27.821 }' 00:13:27.821 13:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.821 13:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.387 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.387 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:28.387 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:28.387 13:32:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:28.646 [2024-07-15 13:32:07.976641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.646 13:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.646 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.646 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.904 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.904 "name": 
"Existed_Raid", 00:13:28.904 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:28.904 "strip_size_kb": 64, 00:13:28.904 "state": "configuring", 00:13:28.904 "raid_level": "raid0", 00:13:28.904 "superblock": true, 00:13:28.904 "num_base_bdevs": 3, 00:13:28.904 "num_base_bdevs_discovered": 2, 00:13:28.905 "num_base_bdevs_operational": 3, 00:13:28.905 "base_bdevs_list": [ 00:13:28.905 { 00:13:28.905 "name": null, 00:13:28.905 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:28.905 "is_configured": false, 00:13:28.905 "data_offset": 2048, 00:13:28.905 "data_size": 63488 00:13:28.905 }, 00:13:28.905 { 00:13:28.905 "name": "BaseBdev2", 00:13:28.905 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:28.905 "is_configured": true, 00:13:28.905 "data_offset": 2048, 00:13:28.905 "data_size": 63488 00:13:28.905 }, 00:13:28.905 { 00:13:28.905 "name": "BaseBdev3", 00:13:28.905 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:28.905 "is_configured": true, 00:13:28.905 "data_offset": 2048, 00:13:28.905 "data_size": 63488 00:13:28.905 } 00:13:28.905 ] 00:13:28.905 }' 00:13:28.905 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.905 13:32:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.470 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.471 13:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:29.729 13:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:29.729 13:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.729 13:32:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:29.987 13:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7 00:13:30.247 [2024-07-15 13:32:09.561353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:30.247 [2024-07-15 13:32:09.561507] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8c6e90 00:13:30.247 [2024-07-15 13:32:09.561521] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:30.247 [2024-07-15 13:32:09.561700] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x5cd940 00:13:30.247 [2024-07-15 13:32:09.561810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8c6e90 00:13:30.247 [2024-07-15 13:32:09.561820] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8c6e90 00:13:30.247 [2024-07-15 13:32:09.561909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.247 NewBaseBdev 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:30.247 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:30.247 13:32:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.506 13:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:30.764 [ 00:13:30.764 { 00:13:30.764 "name": "NewBaseBdev", 00:13:30.764 "aliases": [ 00:13:30.764 "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7" 00:13:30.764 ], 00:13:30.764 "product_name": "Malloc disk", 00:13:30.764 "block_size": 512, 00:13:30.764 "num_blocks": 65536, 00:13:30.764 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:30.764 "assigned_rate_limits": { 00:13:30.764 "rw_ios_per_sec": 0, 00:13:30.764 "rw_mbytes_per_sec": 0, 00:13:30.764 "r_mbytes_per_sec": 0, 00:13:30.764 "w_mbytes_per_sec": 0 00:13:30.764 }, 00:13:30.764 "claimed": true, 00:13:30.764 "claim_type": "exclusive_write", 00:13:30.764 "zoned": false, 00:13:30.764 "supported_io_types": { 00:13:30.764 "read": true, 00:13:30.764 "write": true, 00:13:30.764 "unmap": true, 00:13:30.764 "flush": true, 00:13:30.764 "reset": true, 00:13:30.764 "nvme_admin": false, 00:13:30.764 "nvme_io": false, 00:13:30.764 "nvme_io_md": false, 00:13:30.764 "write_zeroes": true, 00:13:30.764 "zcopy": true, 00:13:30.764 "get_zone_info": false, 00:13:30.764 "zone_management": false, 00:13:30.764 "zone_append": false, 00:13:30.764 "compare": false, 00:13:30.764 "compare_and_write": false, 00:13:30.764 "abort": true, 00:13:30.764 "seek_hole": false, 00:13:30.764 "seek_data": false, 00:13:30.764 "copy": true, 00:13:30.764 "nvme_iov_md": false 00:13:30.764 }, 00:13:30.764 "memory_domains": [ 00:13:30.764 { 00:13:30.764 "dma_device_id": "system", 00:13:30.764 "dma_device_type": 1 00:13:30.764 }, 00:13:30.764 { 00:13:30.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.764 "dma_device_type": 2 00:13:30.764 } 
00:13:30.764 ], 00:13:30.764 "driver_specific": {} 00:13:30.764 } 00:13:30.764 ] 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.764 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.024 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.024 "name": "Existed_Raid", 00:13:31.024 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:31.024 "strip_size_kb": 64, 00:13:31.024 "state": "online", 00:13:31.024 
"raid_level": "raid0", 00:13:31.024 "superblock": true, 00:13:31.024 "num_base_bdevs": 3, 00:13:31.024 "num_base_bdevs_discovered": 3, 00:13:31.024 "num_base_bdevs_operational": 3, 00:13:31.024 "base_bdevs_list": [ 00:13:31.024 { 00:13:31.024 "name": "NewBaseBdev", 00:13:31.024 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:31.024 "is_configured": true, 00:13:31.024 "data_offset": 2048, 00:13:31.024 "data_size": 63488 00:13:31.024 }, 00:13:31.024 { 00:13:31.024 "name": "BaseBdev2", 00:13:31.024 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:31.024 "is_configured": true, 00:13:31.024 "data_offset": 2048, 00:13:31.024 "data_size": 63488 00:13:31.024 }, 00:13:31.024 { 00:13:31.024 "name": "BaseBdev3", 00:13:31.024 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:31.024 "is_configured": true, 00:13:31.024 "data_offset": 2048, 00:13:31.024 "data_size": 63488 00:13:31.024 } 00:13:31.024 ] 00:13:31.024 }' 00:13:31.024 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.024 13:32:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:31.591 13:32:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:31.849 [2024-07-15 13:32:11.081702] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.849 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:31.849 "name": "Existed_Raid", 00:13:31.849 "aliases": [ 00:13:31.849 "90d45be9-02b4-4e3e-83e0-3955007629ba" 00:13:31.849 ], 00:13:31.849 "product_name": "Raid Volume", 00:13:31.849 "block_size": 512, 00:13:31.849 "num_blocks": 190464, 00:13:31.849 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:31.849 "assigned_rate_limits": { 00:13:31.849 "rw_ios_per_sec": 0, 00:13:31.849 "rw_mbytes_per_sec": 0, 00:13:31.849 "r_mbytes_per_sec": 0, 00:13:31.849 "w_mbytes_per_sec": 0 00:13:31.849 }, 00:13:31.849 "claimed": false, 00:13:31.849 "zoned": false, 00:13:31.849 "supported_io_types": { 00:13:31.849 "read": true, 00:13:31.849 "write": true, 00:13:31.849 "unmap": true, 00:13:31.849 "flush": true, 00:13:31.849 "reset": true, 00:13:31.849 "nvme_admin": false, 00:13:31.849 "nvme_io": false, 00:13:31.849 "nvme_io_md": false, 00:13:31.849 "write_zeroes": true, 00:13:31.849 "zcopy": false, 00:13:31.849 "get_zone_info": false, 00:13:31.849 "zone_management": false, 00:13:31.849 "zone_append": false, 00:13:31.849 "compare": false, 00:13:31.849 "compare_and_write": false, 00:13:31.849 "abort": false, 00:13:31.849 "seek_hole": false, 00:13:31.849 "seek_data": false, 00:13:31.849 "copy": false, 00:13:31.849 "nvme_iov_md": false 00:13:31.849 }, 00:13:31.849 "memory_domains": [ 00:13:31.849 { 00:13:31.849 "dma_device_id": "system", 00:13:31.849 "dma_device_type": 1 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.849 "dma_device_type": 2 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "dma_device_id": "system", 00:13:31.849 "dma_device_type": 1 00:13:31.849 
}, 00:13:31.849 { 00:13:31.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.849 "dma_device_type": 2 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "dma_device_id": "system", 00:13:31.849 "dma_device_type": 1 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.849 "dma_device_type": 2 00:13:31.849 } 00:13:31.849 ], 00:13:31.849 "driver_specific": { 00:13:31.849 "raid": { 00:13:31.849 "uuid": "90d45be9-02b4-4e3e-83e0-3955007629ba", 00:13:31.849 "strip_size_kb": 64, 00:13:31.849 "state": "online", 00:13:31.849 "raid_level": "raid0", 00:13:31.849 "superblock": true, 00:13:31.849 "num_base_bdevs": 3, 00:13:31.849 "num_base_bdevs_discovered": 3, 00:13:31.849 "num_base_bdevs_operational": 3, 00:13:31.849 "base_bdevs_list": [ 00:13:31.849 { 00:13:31.849 "name": "NewBaseBdev", 00:13:31.849 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:31.849 "is_configured": true, 00:13:31.849 "data_offset": 2048, 00:13:31.849 "data_size": 63488 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "name": "BaseBdev2", 00:13:31.849 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:31.849 "is_configured": true, 00:13:31.849 "data_offset": 2048, 00:13:31.849 "data_size": 63488 00:13:31.849 }, 00:13:31.849 { 00:13:31.849 "name": "BaseBdev3", 00:13:31.849 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:31.849 "is_configured": true, 00:13:31.849 "data_offset": 2048, 00:13:31.849 "data_size": 63488 00:13:31.849 } 00:13:31.849 ] 00:13:31.849 } 00:13:31.849 } 00:13:31.849 }' 00:13:31.849 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:31.849 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:31.849 BaseBdev2 00:13:31.849 BaseBdev3' 00:13:31.849 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.849 
13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:31.849 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.108 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.108 "name": "NewBaseBdev", 00:13:32.108 "aliases": [ 00:13:32.108 "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7" 00:13:32.108 ], 00:13:32.108 "product_name": "Malloc disk", 00:13:32.108 "block_size": 512, 00:13:32.108 "num_blocks": 65536, 00:13:32.108 "uuid": "94b981fa-e9a1-4eb9-8ec2-8974b15e1fa7", 00:13:32.108 "assigned_rate_limits": { 00:13:32.108 "rw_ios_per_sec": 0, 00:13:32.108 "rw_mbytes_per_sec": 0, 00:13:32.108 "r_mbytes_per_sec": 0, 00:13:32.108 "w_mbytes_per_sec": 0 00:13:32.108 }, 00:13:32.108 "claimed": true, 00:13:32.108 "claim_type": "exclusive_write", 00:13:32.108 "zoned": false, 00:13:32.108 "supported_io_types": { 00:13:32.108 "read": true, 00:13:32.108 "write": true, 00:13:32.108 "unmap": true, 00:13:32.108 "flush": true, 00:13:32.108 "reset": true, 00:13:32.108 "nvme_admin": false, 00:13:32.108 "nvme_io": false, 00:13:32.108 "nvme_io_md": false, 00:13:32.108 "write_zeroes": true, 00:13:32.108 "zcopy": true, 00:13:32.108 "get_zone_info": false, 00:13:32.108 "zone_management": false, 00:13:32.108 "zone_append": false, 00:13:32.108 "compare": false, 00:13:32.108 "compare_and_write": false, 00:13:32.108 "abort": true, 00:13:32.108 "seek_hole": false, 00:13:32.108 "seek_data": false, 00:13:32.108 "copy": true, 00:13:32.108 "nvme_iov_md": false 00:13:32.108 }, 00:13:32.108 "memory_domains": [ 00:13:32.108 { 00:13:32.108 "dma_device_id": "system", 00:13:32.108 "dma_device_type": 1 00:13:32.108 }, 00:13:32.108 { 00:13:32.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.108 "dma_device_type": 2 00:13:32.108 } 00:13:32.108 ], 00:13:32.108 
"driver_specific": {} 00:13:32.108 }' 00:13:32.108 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.108 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.108 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.108 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.366 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.366 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.366 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:32.367 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.626 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.626 "name": "BaseBdev2", 00:13:32.626 "aliases": [ 00:13:32.626 "e3e0db57-4961-4145-86b4-b9c4f361043e" 00:13:32.626 ], 00:13:32.626 "product_name": 
"Malloc disk", 00:13:32.626 "block_size": 512, 00:13:32.626 "num_blocks": 65536, 00:13:32.626 "uuid": "e3e0db57-4961-4145-86b4-b9c4f361043e", 00:13:32.626 "assigned_rate_limits": { 00:13:32.626 "rw_ios_per_sec": 0, 00:13:32.626 "rw_mbytes_per_sec": 0, 00:13:32.626 "r_mbytes_per_sec": 0, 00:13:32.626 "w_mbytes_per_sec": 0 00:13:32.626 }, 00:13:32.626 "claimed": true, 00:13:32.626 "claim_type": "exclusive_write", 00:13:32.626 "zoned": false, 00:13:32.626 "supported_io_types": { 00:13:32.626 "read": true, 00:13:32.626 "write": true, 00:13:32.626 "unmap": true, 00:13:32.626 "flush": true, 00:13:32.626 "reset": true, 00:13:32.626 "nvme_admin": false, 00:13:32.626 "nvme_io": false, 00:13:32.626 "nvme_io_md": false, 00:13:32.626 "write_zeroes": true, 00:13:32.626 "zcopy": true, 00:13:32.626 "get_zone_info": false, 00:13:32.626 "zone_management": false, 00:13:32.626 "zone_append": false, 00:13:32.626 "compare": false, 00:13:32.626 "compare_and_write": false, 00:13:32.626 "abort": true, 00:13:32.626 "seek_hole": false, 00:13:32.626 "seek_data": false, 00:13:32.626 "copy": true, 00:13:32.626 "nvme_iov_md": false 00:13:32.626 }, 00:13:32.626 "memory_domains": [ 00:13:32.626 { 00:13:32.626 "dma_device_id": "system", 00:13:32.626 "dma_device_type": 1 00:13:32.626 }, 00:13:32.626 { 00:13:32.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.626 "dma_device_type": 2 00:13:32.626 } 00:13:32.626 ], 00:13:32.626 "driver_specific": {} 00:13:32.626 }' 00:13:32.626 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.626 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.626 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.626 13:32:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.626 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.626 
13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.926 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:32.927 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:33.186 "name": "BaseBdev3", 00:13:33.186 "aliases": [ 00:13:33.186 "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5" 00:13:33.186 ], 00:13:33.186 "product_name": "Malloc disk", 00:13:33.186 "block_size": 512, 00:13:33.186 "num_blocks": 65536, 00:13:33.186 "uuid": "658f3405-b1e5-4e08-ba1a-0bb0b0e676c5", 00:13:33.186 "assigned_rate_limits": { 00:13:33.186 "rw_ios_per_sec": 0, 00:13:33.186 "rw_mbytes_per_sec": 0, 00:13:33.186 "r_mbytes_per_sec": 0, 00:13:33.186 "w_mbytes_per_sec": 0 00:13:33.186 }, 00:13:33.186 "claimed": true, 00:13:33.186 "claim_type": "exclusive_write", 00:13:33.186 "zoned": false, 00:13:33.186 "supported_io_types": { 00:13:33.186 "read": true, 00:13:33.186 "write": true, 00:13:33.186 "unmap": true, 
00:13:33.186 "flush": true, 00:13:33.186 "reset": true, 00:13:33.186 "nvme_admin": false, 00:13:33.186 "nvme_io": false, 00:13:33.186 "nvme_io_md": false, 00:13:33.186 "write_zeroes": true, 00:13:33.186 "zcopy": true, 00:13:33.186 "get_zone_info": false, 00:13:33.186 "zone_management": false, 00:13:33.186 "zone_append": false, 00:13:33.186 "compare": false, 00:13:33.186 "compare_and_write": false, 00:13:33.186 "abort": true, 00:13:33.186 "seek_hole": false, 00:13:33.186 "seek_data": false, 00:13:33.186 "copy": true, 00:13:33.186 "nvme_iov_md": false 00:13:33.186 }, 00:13:33.186 "memory_domains": [ 00:13:33.186 { 00:13:33.186 "dma_device_id": "system", 00:13:33.186 "dma_device_type": 1 00:13:33.186 }, 00:13:33.186 { 00:13:33.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.186 "dma_device_type": 2 00:13:33.186 } 00:13:33.186 ], 00:13:33.186 "driver_specific": {} 00:13:33.186 }' 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.186 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.444 13:32:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.444 13:32:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:33.703 [2024-07-15 13:32:13.054673] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:33.703 [2024-07-15 13:32:13.054699] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:33.703 [2024-07-15 13:32:13.054749] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.703 [2024-07-15 13:32:13.054800] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.703 [2024-07-15 13:32:13.054811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8c6e90 name Existed_Raid, state offline 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2091738 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2091738 ']' 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2091738 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2091738 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2091738' 00:13:33.703 killing process with pid 2091738 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2091738 00:13:33.703 [2024-07-15 13:32:13.125083] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:33.703 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2091738 00:13:33.962 [2024-07-15 13:32:13.152414] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:33.962 13:32:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:33.962 00:13:33.962 real 0m28.921s 00:13:33.962 user 0m53.051s 00:13:33.962 sys 0m5.126s 00:13:33.962 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:33.962 13:32:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:33.962 ************************************ 00:13:33.962 END TEST raid_state_function_test_sb 00:13:33.962 ************************************ 00:13:34.221 13:32:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:34.221 13:32:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:13:34.221 13:32:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:34.221 13:32:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:34.221 13:32:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:34.221 ************************************ 00:13:34.221 START TEST raid_superblock_test 00:13:34.221 ************************************ 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2096196 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2096196 /var/tmp/spdk-raid.sock 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2096196 ']' 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:34.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:34.221 13:32:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.221 [2024-07-15 13:32:13.525085] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:13:34.221 [2024-07-15 13:32:13.525158] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2096196 ] 00:13:34.479 [2024-07-15 13:32:13.653799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:34.480 [2024-07-15 13:32:13.757547] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:34.480 [2024-07-15 13:32:13.820038] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:34.480 [2024-07-15 13:32:13.820066] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:35.047 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:35.305 malloc1 00:13:35.305 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:35.565 [2024-07-15 13:32:14.865070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:35.565 [2024-07-15 13:32:14.865123] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.565 [2024-07-15 13:32:14.865141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f81570 00:13:35.565 [2024-07-15 13:32:14.865154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.565 [2024-07-15 13:32:14.866699] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.565 [2024-07-15 13:32:14.866729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:35.565 pt1 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:35.565 13:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:35.565 13:32:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:35.823 malloc2 00:13:35.823 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:36.082 [2024-07-15 13:32:15.359258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:36.082 [2024-07-15 13:32:15.359309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:36.082 [2024-07-15 13:32:15.359326] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f82970 00:13:36.082 [2024-07-15 13:32:15.359339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:36.082 [2024-07-15 13:32:15.360884] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:36.082 [2024-07-15 13:32:15.360919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:36.082 pt2 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:36.082 13:32:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:36.082 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:36.340 malloc3 00:13:36.340 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:36.599 [2024-07-15 13:32:15.857195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:36.599 [2024-07-15 13:32:15.857240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:36.599 [2024-07-15 13:32:15.857256] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2119340 00:13:36.599 [2024-07-15 13:32:15.857269] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:36.599 [2024-07-15 13:32:15.858637] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:36.599 [2024-07-15 13:32:15.858664] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:36.599 pt3 00:13:36.599 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:36.599 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:36.599 13:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:36.857 [2024-07-15 13:32:16.101861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:36.857 [2024-07-15 13:32:16.103102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:13:36.857 [2024-07-15 13:32:16.103155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:36.857 [2024-07-15 13:32:16.103301] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f79ea0 00:13:36.857 [2024-07-15 13:32:16.103312] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:36.857 [2024-07-15 13:32:16.103500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f81240 00:13:36.857 [2024-07-15 13:32:16.103638] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f79ea0 00:13:36.857 [2024-07-15 13:32:16.103648] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f79ea0 00:13:36.857 [2024-07-15 13:32:16.103741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.857 
13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.857 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:37.115 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.115 "name": "raid_bdev1", 00:13:37.115 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:37.115 "strip_size_kb": 64, 00:13:37.115 "state": "online", 00:13:37.115 "raid_level": "raid0", 00:13:37.115 "superblock": true, 00:13:37.115 "num_base_bdevs": 3, 00:13:37.115 "num_base_bdevs_discovered": 3, 00:13:37.115 "num_base_bdevs_operational": 3, 00:13:37.115 "base_bdevs_list": [ 00:13:37.115 { 00:13:37.115 "name": "pt1", 00:13:37.115 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:37.115 "is_configured": true, 00:13:37.115 "data_offset": 2048, 00:13:37.115 "data_size": 63488 00:13:37.115 }, 00:13:37.115 { 00:13:37.115 "name": "pt2", 00:13:37.115 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:37.115 "is_configured": true, 00:13:37.115 "data_offset": 2048, 00:13:37.115 "data_size": 63488 00:13:37.115 }, 00:13:37.115 { 00:13:37.115 "name": "pt3", 00:13:37.115 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:37.115 "is_configured": true, 00:13:37.115 "data_offset": 2048, 00:13:37.115 "data_size": 63488 00:13:37.115 } 00:13:37.115 ] 00:13:37.115 }' 00:13:37.115 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.115 13:32:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.680 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:37.681 13:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:37.938 [2024-07-15 13:32:17.197031] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:37.938 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:37.938 "name": "raid_bdev1", 00:13:37.938 "aliases": [ 00:13:37.938 "257c2dca-6f2e-4aa7-9e94-3114836f93e4" 00:13:37.938 ], 00:13:37.938 "product_name": "Raid Volume", 00:13:37.938 "block_size": 512, 00:13:37.938 "num_blocks": 190464, 00:13:37.938 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:37.938 "assigned_rate_limits": { 00:13:37.938 "rw_ios_per_sec": 0, 00:13:37.938 "rw_mbytes_per_sec": 0, 00:13:37.938 "r_mbytes_per_sec": 0, 00:13:37.938 "w_mbytes_per_sec": 0 00:13:37.938 }, 00:13:37.938 "claimed": false, 00:13:37.938 "zoned": false, 00:13:37.938 "supported_io_types": { 00:13:37.938 "read": true, 00:13:37.938 "write": true, 00:13:37.938 "unmap": true, 00:13:37.939 "flush": true, 00:13:37.939 "reset": true, 00:13:37.939 "nvme_admin": false, 00:13:37.939 "nvme_io": false, 00:13:37.939 "nvme_io_md": false, 00:13:37.939 "write_zeroes": true, 00:13:37.939 "zcopy": false, 00:13:37.939 "get_zone_info": false, 00:13:37.939 "zone_management": false, 00:13:37.939 "zone_append": false, 00:13:37.939 "compare": false, 00:13:37.939 "compare_and_write": false, 00:13:37.939 "abort": false, 00:13:37.939 
"seek_hole": false, 00:13:37.939 "seek_data": false, 00:13:37.939 "copy": false, 00:13:37.939 "nvme_iov_md": false 00:13:37.939 }, 00:13:37.939 "memory_domains": [ 00:13:37.939 { 00:13:37.939 "dma_device_id": "system", 00:13:37.939 "dma_device_type": 1 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.939 "dma_device_type": 2 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "dma_device_id": "system", 00:13:37.939 "dma_device_type": 1 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.939 "dma_device_type": 2 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "dma_device_id": "system", 00:13:37.939 "dma_device_type": 1 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.939 "dma_device_type": 2 00:13:37.939 } 00:13:37.939 ], 00:13:37.939 "driver_specific": { 00:13:37.939 "raid": { 00:13:37.939 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:37.939 "strip_size_kb": 64, 00:13:37.939 "state": "online", 00:13:37.939 "raid_level": "raid0", 00:13:37.939 "superblock": true, 00:13:37.939 "num_base_bdevs": 3, 00:13:37.939 "num_base_bdevs_discovered": 3, 00:13:37.939 "num_base_bdevs_operational": 3, 00:13:37.939 "base_bdevs_list": [ 00:13:37.939 { 00:13:37.939 "name": "pt1", 00:13:37.939 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:37.939 "is_configured": true, 00:13:37.939 "data_offset": 2048, 00:13:37.939 "data_size": 63488 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "name": "pt2", 00:13:37.939 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:37.939 "is_configured": true, 00:13:37.939 "data_offset": 2048, 00:13:37.939 "data_size": 63488 00:13:37.939 }, 00:13:37.939 { 00:13:37.939 "name": "pt3", 00:13:37.939 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:37.939 "is_configured": true, 00:13:37.939 "data_offset": 2048, 00:13:37.939 "data_size": 63488 00:13:37.939 } 00:13:37.939 ] 00:13:37.939 } 00:13:37.939 } 00:13:37.939 }' 
00:13:37.939 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:37.939 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:37.939 pt2 00:13:37.939 pt3' 00:13:37.939 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.939 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:37.939 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.196 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.196 "name": "pt1", 00:13:38.196 "aliases": [ 00:13:38.196 "00000000-0000-0000-0000-000000000001" 00:13:38.196 ], 00:13:38.196 "product_name": "passthru", 00:13:38.196 "block_size": 512, 00:13:38.196 "num_blocks": 65536, 00:13:38.196 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:38.196 "assigned_rate_limits": { 00:13:38.196 "rw_ios_per_sec": 0, 00:13:38.196 "rw_mbytes_per_sec": 0, 00:13:38.196 "r_mbytes_per_sec": 0, 00:13:38.196 "w_mbytes_per_sec": 0 00:13:38.196 }, 00:13:38.196 "claimed": true, 00:13:38.196 "claim_type": "exclusive_write", 00:13:38.196 "zoned": false, 00:13:38.196 "supported_io_types": { 00:13:38.196 "read": true, 00:13:38.196 "write": true, 00:13:38.196 "unmap": true, 00:13:38.196 "flush": true, 00:13:38.196 "reset": true, 00:13:38.196 "nvme_admin": false, 00:13:38.196 "nvme_io": false, 00:13:38.196 "nvme_io_md": false, 00:13:38.196 "write_zeroes": true, 00:13:38.196 "zcopy": true, 00:13:38.196 "get_zone_info": false, 00:13:38.196 "zone_management": false, 00:13:38.196 "zone_append": false, 00:13:38.196 "compare": false, 00:13:38.196 "compare_and_write": false, 00:13:38.196 "abort": true, 00:13:38.196 "seek_hole": false, 00:13:38.196 
"seek_data": false, 00:13:38.196 "copy": true, 00:13:38.196 "nvme_iov_md": false 00:13:38.196 }, 00:13:38.196 "memory_domains": [ 00:13:38.196 { 00:13:38.196 "dma_device_id": "system", 00:13:38.196 "dma_device_type": 1 00:13:38.196 }, 00:13:38.196 { 00:13:38.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.196 "dma_device_type": 2 00:13:38.196 } 00:13:38.196 ], 00:13:38.196 "driver_specific": { 00:13:38.196 "passthru": { 00:13:38.196 "name": "pt1", 00:13:38.196 "base_bdev_name": "malloc1" 00:13:38.196 } 00:13:38.196 } 00:13:38.196 }' 00:13:38.196 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.196 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.196 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.196 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:38.453 13:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.711 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.711 "name": "pt2", 00:13:38.711 "aliases": [ 00:13:38.711 "00000000-0000-0000-0000-000000000002" 00:13:38.711 ], 00:13:38.711 "product_name": "passthru", 00:13:38.711 "block_size": 512, 00:13:38.711 "num_blocks": 65536, 00:13:38.711 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:38.711 "assigned_rate_limits": { 00:13:38.711 "rw_ios_per_sec": 0, 00:13:38.711 "rw_mbytes_per_sec": 0, 00:13:38.711 "r_mbytes_per_sec": 0, 00:13:38.711 "w_mbytes_per_sec": 0 00:13:38.711 }, 00:13:38.711 "claimed": true, 00:13:38.711 "claim_type": "exclusive_write", 00:13:38.711 "zoned": false, 00:13:38.711 "supported_io_types": { 00:13:38.711 "read": true, 00:13:38.711 "write": true, 00:13:38.711 "unmap": true, 00:13:38.711 "flush": true, 00:13:38.711 "reset": true, 00:13:38.711 "nvme_admin": false, 00:13:38.711 "nvme_io": false, 00:13:38.711 "nvme_io_md": false, 00:13:38.711 "write_zeroes": true, 00:13:38.711 "zcopy": true, 00:13:38.711 "get_zone_info": false, 00:13:38.711 "zone_management": false, 00:13:38.711 "zone_append": false, 00:13:38.711 "compare": false, 00:13:38.711 "compare_and_write": false, 00:13:38.711 "abort": true, 00:13:38.711 "seek_hole": false, 00:13:38.711 "seek_data": false, 00:13:38.711 "copy": true, 00:13:38.711 "nvme_iov_md": false 00:13:38.711 }, 00:13:38.711 "memory_domains": [ 00:13:38.711 { 00:13:38.711 "dma_device_id": "system", 00:13:38.711 "dma_device_type": 1 00:13:38.711 }, 00:13:38.711 { 00:13:38.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.711 "dma_device_type": 2 00:13:38.711 } 00:13:38.711 ], 00:13:38.711 "driver_specific": { 00:13:38.711 "passthru": { 00:13:38.711 "name": "pt2", 00:13:38.711 "base_bdev_name": "malloc2" 00:13:38.711 } 00:13:38.711 } 00:13:38.711 }' 00:13:38.711 13:32:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.968 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.226 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.226 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.226 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:39.226 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:39.226 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:39.485 "name": "pt3", 00:13:39.485 "aliases": [ 00:13:39.485 "00000000-0000-0000-0000-000000000003" 00:13:39.485 ], 00:13:39.485 "product_name": "passthru", 00:13:39.485 "block_size": 512, 00:13:39.485 "num_blocks": 65536, 00:13:39.485 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:39.485 "assigned_rate_limits": { 
00:13:39.485 "rw_ios_per_sec": 0, 00:13:39.485 "rw_mbytes_per_sec": 0, 00:13:39.485 "r_mbytes_per_sec": 0, 00:13:39.485 "w_mbytes_per_sec": 0 00:13:39.485 }, 00:13:39.485 "claimed": true, 00:13:39.485 "claim_type": "exclusive_write", 00:13:39.485 "zoned": false, 00:13:39.485 "supported_io_types": { 00:13:39.485 "read": true, 00:13:39.485 "write": true, 00:13:39.485 "unmap": true, 00:13:39.485 "flush": true, 00:13:39.485 "reset": true, 00:13:39.485 "nvme_admin": false, 00:13:39.485 "nvme_io": false, 00:13:39.485 "nvme_io_md": false, 00:13:39.485 "write_zeroes": true, 00:13:39.485 "zcopy": true, 00:13:39.485 "get_zone_info": false, 00:13:39.485 "zone_management": false, 00:13:39.485 "zone_append": false, 00:13:39.485 "compare": false, 00:13:39.485 "compare_and_write": false, 00:13:39.485 "abort": true, 00:13:39.485 "seek_hole": false, 00:13:39.485 "seek_data": false, 00:13:39.485 "copy": true, 00:13:39.485 "nvme_iov_md": false 00:13:39.485 }, 00:13:39.485 "memory_domains": [ 00:13:39.485 { 00:13:39.485 "dma_device_id": "system", 00:13:39.485 "dma_device_type": 1 00:13:39.485 }, 00:13:39.485 { 00:13:39.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.485 "dma_device_type": 2 00:13:39.485 } 00:13:39.485 ], 00:13:39.485 "driver_specific": { 00:13:39.485 "passthru": { 00:13:39.485 "name": "pt3", 00:13:39.485 "base_bdev_name": "malloc3" 00:13:39.485 } 00:13:39.485 } 00:13:39.485 }' 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:39.485 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.743 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:39.743 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:39.743 13:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.743 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.743 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:39.743 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:39.743 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:40.002 [2024-07-15 13:32:19.286609] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:40.002 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=257c2dca-6f2e-4aa7-9e94-3114836f93e4 00:13:40.002 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 257c2dca-6f2e-4aa7-9e94-3114836f93e4 ']' 00:13:40.002 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:40.260 [2024-07-15 13:32:19.534988] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:40.260 [2024-07-15 13:32:19.535016] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:40.260 [2024-07-15 13:32:19.535075] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.260 [2024-07-15 13:32:19.535130] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:13:40.260 [2024-07-15 13:32:19.535142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f79ea0 name raid_bdev1, state offline 00:13:40.260 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.260 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:40.518 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:40.518 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:40.518 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:40.518 13:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:40.777 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:40.777 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:41.037 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:41.037 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:41.295 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:41.295 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:41.553 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:13:41.553 13:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:41.554 13:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:41.812 [2024-07-15 13:32:21.026891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:41.812 [2024-07-15 13:32:21.028307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:41.812 [2024-07-15 13:32:21.028354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:41.812 [2024-07-15 13:32:21.028402] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:41.812 [2024-07-15 13:32:21.028445] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:41.812 [2024-07-15 13:32:21.028468] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:41.812 [2024-07-15 13:32:21.028486] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:41.812 [2024-07-15 13:32:21.028496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2124ff0 name raid_bdev1, state configuring 00:13:41.812 request: 00:13:41.812 { 00:13:41.812 "name": "raid_bdev1", 00:13:41.812 "raid_level": "raid0", 00:13:41.812 "base_bdevs": [ 00:13:41.812 "malloc1", 00:13:41.812 "malloc2", 00:13:41.812 "malloc3" 00:13:41.812 ], 00:13:41.812 "strip_size_kb": 64, 00:13:41.812 "superblock": false, 00:13:41.812 "method": "bdev_raid_create", 00:13:41.812 "req_id": 1 00:13:41.812 } 00:13:41.812 Got JSON-RPC error response 00:13:41.812 response: 00:13:41.812 { 00:13:41.812 "code": -17, 00:13:41.812 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:41.812 } 00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.812 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:42.071 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:42.071 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:42.071 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:42.330 [2024-07-15 13:32:21.520132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:42.330 [2024-07-15 13:32:21.520181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:42.330 [2024-07-15 13:32:21.520204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f817a0 00:13:42.330 [2024-07-15 13:32:21.520217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:42.330 [2024-07-15 13:32:21.521833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:42.330 [2024-07-15 13:32:21.521866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:42.330 [2024-07-15 13:32:21.521948] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:42.330 [2024-07-15 13:32:21.521985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:42.330 pt1 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.330 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.589 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.589 "name": "raid_bdev1", 00:13:42.589 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:42.589 "strip_size_kb": 64, 00:13:42.589 "state": "configuring", 00:13:42.589 "raid_level": "raid0", 00:13:42.589 "superblock": true, 00:13:42.589 "num_base_bdevs": 3, 00:13:42.589 "num_base_bdevs_discovered": 1, 00:13:42.589 "num_base_bdevs_operational": 3, 00:13:42.589 "base_bdevs_list": [ 00:13:42.589 { 00:13:42.589 "name": "pt1", 00:13:42.589 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:42.589 
"is_configured": true, 00:13:42.589 "data_offset": 2048, 00:13:42.589 "data_size": 63488 00:13:42.589 }, 00:13:42.589 { 00:13:42.589 "name": null, 00:13:42.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:42.589 "is_configured": false, 00:13:42.589 "data_offset": 2048, 00:13:42.589 "data_size": 63488 00:13:42.589 }, 00:13:42.589 { 00:13:42.589 "name": null, 00:13:42.589 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:42.589 "is_configured": false, 00:13:42.589 "data_offset": 2048, 00:13:42.589 "data_size": 63488 00:13:42.589 } 00:13:42.589 ] 00:13:42.589 }' 00:13:42.589 13:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.589 13:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.156 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:43.156 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:43.415 [2024-07-15 13:32:22.611031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:43.415 [2024-07-15 13:32:22.611094] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:43.415 [2024-07-15 13:32:22.611114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f78c70 00:13:43.415 [2024-07-15 13:32:22.611128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:43.415 [2024-07-15 13:32:22.611492] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:43.415 [2024-07-15 13:32:22.611511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:43.415 [2024-07-15 13:32:22.611576] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:43.415 [2024-07-15 
13:32:22.611596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:43.415 pt2 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:43.415 [2024-07-15 13:32:22.799539] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.415 13:32:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:43.730 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.730 "name": "raid_bdev1", 00:13:43.730 
"uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:43.730 "strip_size_kb": 64, 00:13:43.730 "state": "configuring", 00:13:43.730 "raid_level": "raid0", 00:13:43.730 "superblock": true, 00:13:43.730 "num_base_bdevs": 3, 00:13:43.730 "num_base_bdevs_discovered": 1, 00:13:43.730 "num_base_bdevs_operational": 3, 00:13:43.730 "base_bdevs_list": [ 00:13:43.730 { 00:13:43.730 "name": "pt1", 00:13:43.730 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:43.730 "is_configured": true, 00:13:43.730 "data_offset": 2048, 00:13:43.730 "data_size": 63488 00:13:43.730 }, 00:13:43.730 { 00:13:43.730 "name": null, 00:13:43.730 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:43.730 "is_configured": false, 00:13:43.730 "data_offset": 2048, 00:13:43.730 "data_size": 63488 00:13:43.730 }, 00:13:43.730 { 00:13:43.730 "name": null, 00:13:43.730 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:43.730 "is_configured": false, 00:13:43.730 "data_offset": 2048, 00:13:43.730 "data_size": 63488 00:13:43.730 } 00:13:43.730 ] 00:13:43.730 }' 00:13:43.730 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.730 13:32:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.326 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:44.326 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:44.326 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:44.585 [2024-07-15 13:32:23.882412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:44.585 [2024-07-15 13:32:23.882466] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:44.585 [2024-07-15 13:32:23.882486] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2119fa0 00:13:44.585 [2024-07-15 13:32:23.882504] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:44.585 [2024-07-15 13:32:23.882853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:44.585 [2024-07-15 13:32:23.882872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:44.585 [2024-07-15 13:32:23.882953] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:44.585 [2024-07-15 13:32:23.882974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:44.585 pt2 00:13:44.585 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:44.585 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:44.585 13:32:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:44.844 [2024-07-15 13:32:24.127064] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:44.844 [2024-07-15 13:32:24.127107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:44.844 [2024-07-15 13:32:24.127127] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211ab30 00:13:44.844 [2024-07-15 13:32:24.127139] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:44.844 [2024-07-15 13:32:24.127460] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:44.844 [2024-07-15 13:32:24.127478] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:44.844 [2024-07-15 13:32:24.127536] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:44.844 
[2024-07-15 13:32:24.127555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:44.844 [2024-07-15 13:32:24.127662] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x211bc00 00:13:44.844 [2024-07-15 13:32:24.127673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:44.844 [2024-07-15 13:32:24.127838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21249b0 00:13:44.844 [2024-07-15 13:32:24.127976] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x211bc00 00:13:44.844 [2024-07-15 13:32:24.127987] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x211bc00 00:13:44.844 [2024-07-15 13:32:24.128087] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:44.844 pt3 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.844 13:32:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.844 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:45.102 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.102 "name": "raid_bdev1", 00:13:45.102 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:45.102 "strip_size_kb": 64, 00:13:45.102 "state": "online", 00:13:45.102 "raid_level": "raid0", 00:13:45.102 "superblock": true, 00:13:45.102 "num_base_bdevs": 3, 00:13:45.102 "num_base_bdevs_discovered": 3, 00:13:45.102 "num_base_bdevs_operational": 3, 00:13:45.102 "base_bdevs_list": [ 00:13:45.102 { 00:13:45.102 "name": "pt1", 00:13:45.102 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:45.102 "is_configured": true, 00:13:45.102 "data_offset": 2048, 00:13:45.102 "data_size": 63488 00:13:45.102 }, 00:13:45.102 { 00:13:45.102 "name": "pt2", 00:13:45.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:45.102 "is_configured": true, 00:13:45.102 "data_offset": 2048, 00:13:45.102 "data_size": 63488 00:13:45.102 }, 00:13:45.102 { 00:13:45.102 "name": "pt3", 00:13:45.102 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:45.102 "is_configured": true, 00:13:45.102 "data_offset": 2048, 00:13:45.102 "data_size": 63488 00:13:45.102 } 00:13:45.102 ] 00:13:45.102 }' 00:13:45.102 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.102 13:32:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:45.668 13:32:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:45.927 [2024-07-15 13:32:25.222288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:45.927 "name": "raid_bdev1", 00:13:45.927 "aliases": [ 00:13:45.927 "257c2dca-6f2e-4aa7-9e94-3114836f93e4" 00:13:45.927 ], 00:13:45.927 "product_name": "Raid Volume", 00:13:45.927 "block_size": 512, 00:13:45.927 "num_blocks": 190464, 00:13:45.927 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:45.927 "assigned_rate_limits": { 00:13:45.927 "rw_ios_per_sec": 0, 00:13:45.927 "rw_mbytes_per_sec": 0, 00:13:45.927 "r_mbytes_per_sec": 0, 00:13:45.927 "w_mbytes_per_sec": 0 00:13:45.927 }, 00:13:45.927 "claimed": false, 00:13:45.927 "zoned": false, 00:13:45.927 "supported_io_types": { 00:13:45.927 "read": true, 00:13:45.927 "write": true, 00:13:45.927 "unmap": true, 00:13:45.927 "flush": true, 00:13:45.927 "reset": true, 00:13:45.927 "nvme_admin": false, 00:13:45.927 "nvme_io": false, 00:13:45.927 "nvme_io_md": false, 00:13:45.927 "write_zeroes": true, 00:13:45.927 "zcopy": false, 00:13:45.927 
"get_zone_info": false, 00:13:45.927 "zone_management": false, 00:13:45.927 "zone_append": false, 00:13:45.927 "compare": false, 00:13:45.927 "compare_and_write": false, 00:13:45.927 "abort": false, 00:13:45.927 "seek_hole": false, 00:13:45.927 "seek_data": false, 00:13:45.927 "copy": false, 00:13:45.927 "nvme_iov_md": false 00:13:45.927 }, 00:13:45.927 "memory_domains": [ 00:13:45.927 { 00:13:45.927 "dma_device_id": "system", 00:13:45.927 "dma_device_type": 1 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.927 "dma_device_type": 2 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "dma_device_id": "system", 00:13:45.927 "dma_device_type": 1 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.927 "dma_device_type": 2 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "dma_device_id": "system", 00:13:45.927 "dma_device_type": 1 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.927 "dma_device_type": 2 00:13:45.927 } 00:13:45.927 ], 00:13:45.927 "driver_specific": { 00:13:45.927 "raid": { 00:13:45.927 "uuid": "257c2dca-6f2e-4aa7-9e94-3114836f93e4", 00:13:45.927 "strip_size_kb": 64, 00:13:45.927 "state": "online", 00:13:45.927 "raid_level": "raid0", 00:13:45.927 "superblock": true, 00:13:45.927 "num_base_bdevs": 3, 00:13:45.927 "num_base_bdevs_discovered": 3, 00:13:45.927 "num_base_bdevs_operational": 3, 00:13:45.927 "base_bdevs_list": [ 00:13:45.927 { 00:13:45.927 "name": "pt1", 00:13:45.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:45.927 "is_configured": true, 00:13:45.927 "data_offset": 2048, 00:13:45.927 "data_size": 63488 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "name": "pt2", 00:13:45.927 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:45.927 "is_configured": true, 00:13:45.927 "data_offset": 2048, 00:13:45.927 "data_size": 63488 00:13:45.927 }, 00:13:45.927 { 00:13:45.927 "name": "pt3", 00:13:45.927 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:13:45.927 "is_configured": true, 00:13:45.927 "data_offset": 2048, 00:13:45.927 "data_size": 63488 00:13:45.927 } 00:13:45.927 ] 00:13:45.927 } 00:13:45.927 } 00:13:45.927 }' 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:45.927 pt2 00:13:45.927 pt3' 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:45.927 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:46.185 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:46.185 "name": "pt1", 00:13:46.185 "aliases": [ 00:13:46.185 "00000000-0000-0000-0000-000000000001" 00:13:46.185 ], 00:13:46.185 "product_name": "passthru", 00:13:46.185 "block_size": 512, 00:13:46.185 "num_blocks": 65536, 00:13:46.185 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:46.185 "assigned_rate_limits": { 00:13:46.185 "rw_ios_per_sec": 0, 00:13:46.185 "rw_mbytes_per_sec": 0, 00:13:46.185 "r_mbytes_per_sec": 0, 00:13:46.185 "w_mbytes_per_sec": 0 00:13:46.185 }, 00:13:46.185 "claimed": true, 00:13:46.185 "claim_type": "exclusive_write", 00:13:46.185 "zoned": false, 00:13:46.185 "supported_io_types": { 00:13:46.185 "read": true, 00:13:46.185 "write": true, 00:13:46.185 "unmap": true, 00:13:46.185 "flush": true, 00:13:46.185 "reset": true, 00:13:46.185 "nvme_admin": false, 00:13:46.185 "nvme_io": false, 00:13:46.185 "nvme_io_md": false, 00:13:46.185 "write_zeroes": true, 00:13:46.185 "zcopy": true, 00:13:46.185 "get_zone_info": false, 
00:13:46.185 "zone_management": false, 00:13:46.185 "zone_append": false, 00:13:46.185 "compare": false, 00:13:46.185 "compare_and_write": false, 00:13:46.185 "abort": true, 00:13:46.185 "seek_hole": false, 00:13:46.185 "seek_data": false, 00:13:46.185 "copy": true, 00:13:46.185 "nvme_iov_md": false 00:13:46.185 }, 00:13:46.185 "memory_domains": [ 00:13:46.185 { 00:13:46.185 "dma_device_id": "system", 00:13:46.185 "dma_device_type": 1 00:13:46.185 }, 00:13:46.185 { 00:13:46.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.185 "dma_device_type": 2 00:13:46.185 } 00:13:46.185 ], 00:13:46.185 "driver_specific": { 00:13:46.185 "passthru": { 00:13:46.185 "name": "pt1", 00:13:46.185 "base_bdev_name": "malloc1" 00:13:46.185 } 00:13:46.185 } 00:13:46.185 }' 00:13:46.185 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.185 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:46.442 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:46.700 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:46.700 13:32:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:46.700 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:46.700 13:32:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:46.957 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:46.957 "name": "pt2", 00:13:46.957 "aliases": [ 00:13:46.958 "00000000-0000-0000-0000-000000000002" 00:13:46.958 ], 00:13:46.958 "product_name": "passthru", 00:13:46.958 "block_size": 512, 00:13:46.958 "num_blocks": 65536, 00:13:46.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:46.958 "assigned_rate_limits": { 00:13:46.958 "rw_ios_per_sec": 0, 00:13:46.958 "rw_mbytes_per_sec": 0, 00:13:46.958 "r_mbytes_per_sec": 0, 00:13:46.958 "w_mbytes_per_sec": 0 00:13:46.958 }, 00:13:46.958 "claimed": true, 00:13:46.958 "claim_type": "exclusive_write", 00:13:46.958 "zoned": false, 00:13:46.958 "supported_io_types": { 00:13:46.958 "read": true, 00:13:46.958 "write": true, 00:13:46.958 "unmap": true, 00:13:46.958 "flush": true, 00:13:46.958 "reset": true, 00:13:46.958 "nvme_admin": false, 00:13:46.958 "nvme_io": false, 00:13:46.958 "nvme_io_md": false, 00:13:46.958 "write_zeroes": true, 00:13:46.958 "zcopy": true, 00:13:46.958 "get_zone_info": false, 00:13:46.958 "zone_management": false, 00:13:46.958 "zone_append": false, 00:13:46.958 "compare": false, 00:13:46.958 "compare_and_write": false, 00:13:46.958 "abort": true, 00:13:46.958 "seek_hole": false, 00:13:46.958 "seek_data": false, 00:13:46.958 "copy": true, 00:13:46.958 "nvme_iov_md": false 00:13:46.958 }, 00:13:46.958 "memory_domains": [ 00:13:46.958 { 00:13:46.958 "dma_device_id": "system", 00:13:46.958 "dma_device_type": 1 00:13:46.958 }, 00:13:46.958 { 00:13:46.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.958 
"dma_device_type": 2 00:13:46.958 } 00:13:46.958 ], 00:13:46.958 "driver_specific": { 00:13:46.958 "passthru": { 00:13:46.958 "name": "pt2", 00:13:46.958 "base_bdev_name": "malloc2" 00:13:46.958 } 00:13:46.958 } 00:13:46.958 }' 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:46.958 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:47.215 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.473 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.473 "name": "pt3", 00:13:47.473 "aliases": [ 00:13:47.473 
"00000000-0000-0000-0000-000000000003" 00:13:47.473 ], 00:13:47.473 "product_name": "passthru", 00:13:47.473 "block_size": 512, 00:13:47.473 "num_blocks": 65536, 00:13:47.473 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:47.473 "assigned_rate_limits": { 00:13:47.473 "rw_ios_per_sec": 0, 00:13:47.473 "rw_mbytes_per_sec": 0, 00:13:47.473 "r_mbytes_per_sec": 0, 00:13:47.473 "w_mbytes_per_sec": 0 00:13:47.473 }, 00:13:47.473 "claimed": true, 00:13:47.473 "claim_type": "exclusive_write", 00:13:47.473 "zoned": false, 00:13:47.473 "supported_io_types": { 00:13:47.473 "read": true, 00:13:47.473 "write": true, 00:13:47.473 "unmap": true, 00:13:47.473 "flush": true, 00:13:47.473 "reset": true, 00:13:47.473 "nvme_admin": false, 00:13:47.473 "nvme_io": false, 00:13:47.473 "nvme_io_md": false, 00:13:47.473 "write_zeroes": true, 00:13:47.473 "zcopy": true, 00:13:47.473 "get_zone_info": false, 00:13:47.473 "zone_management": false, 00:13:47.473 "zone_append": false, 00:13:47.473 "compare": false, 00:13:47.473 "compare_and_write": false, 00:13:47.473 "abort": true, 00:13:47.473 "seek_hole": false, 00:13:47.473 "seek_data": false, 00:13:47.473 "copy": true, 00:13:47.473 "nvme_iov_md": false 00:13:47.473 }, 00:13:47.473 "memory_domains": [ 00:13:47.473 { 00:13:47.473 "dma_device_id": "system", 00:13:47.473 "dma_device_type": 1 00:13:47.473 }, 00:13:47.473 { 00:13:47.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.473 "dma_device_type": 2 00:13:47.473 } 00:13:47.473 ], 00:13:47.473 "driver_specific": { 00:13:47.473 "passthru": { 00:13:47.473 "name": "pt3", 00:13:47.473 "base_bdev_name": "malloc3" 00:13:47.473 } 00:13:47.473 } 00:13:47.473 }' 00:13:47.473 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.473 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.473 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:47.473 13:32:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.473 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.731 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:47.731 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.731 13:32:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:47.731 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:47.988 [2024-07-15 13:32:27.327839] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 257c2dca-6f2e-4aa7-9e94-3114836f93e4 '!=' 257c2dca-6f2e-4aa7-9e94-3114836f93e4 ']' 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2096196 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2096196 ']' 00:13:47.988 13:32:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2096196 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2096196 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2096196' 00:13:47.988 killing process with pid 2096196 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2096196 00:13:47.988 [2024-07-15 13:32:27.377583] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:47.988 [2024-07-15 13:32:27.377637] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:47.988 [2024-07-15 13:32:27.377690] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:47.988 [2024-07-15 13:32:27.377702] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211bc00 name raid_bdev1, state offline 00:13:47.988 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2096196 00:13:47.988 [2024-07-15 13:32:27.403981] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:48.246 13:32:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:48.246 00:13:48.246 real 0m14.151s 00:13:48.246 user 0m25.582s 00:13:48.246 sys 0m2.509s 00:13:48.246 13:32:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:48.246 13:32:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.246 ************************************ 00:13:48.246 END TEST raid_superblock_test 00:13:48.246 ************************************ 00:13:48.246 13:32:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:48.246 13:32:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:13:48.246 13:32:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:48.246 13:32:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:48.246 13:32:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:48.504 ************************************ 00:13:48.504 START TEST raid_read_error_test 00:13:48.504 ************************************ 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:48.504 13:32:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.kEoWwuV7Yq 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2098255 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2098255 /var/tmp/spdk-raid.sock 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2098255 ']' 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:48.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:48.504 13:32:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.504 [2024-07-15 13:32:27.764262] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:13:48.504 [2024-07-15 13:32:27.764325] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2098255 ] 00:13:48.504 [2024-07-15 13:32:27.890903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:48.762 [2024-07-15 13:32:28.000789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:48.762 [2024-07-15 13:32:28.072277] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:48.762 [2024-07-15 13:32:28.072326] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:49.020 13:32:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:49.020 13:32:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:49.020 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:49.020 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:49.279 BaseBdev1_malloc 00:13:49.279 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:49.537 true 00:13:49.537 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:49.537 [2024-07-15 13:32:28.947752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:49.537 [2024-07-15 13:32:28.947798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:49.537 [2024-07-15 13:32:28.947821] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad30d0 00:13:49.537 [2024-07-15 13:32:28.947834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:49.537 [2024-07-15 13:32:28.949678] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:49.538 [2024-07-15 13:32:28.949710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:49.538 BaseBdev1 00:13:49.796 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:49.796 13:32:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:49.796 BaseBdev2_malloc 00:13:49.796 13:32:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:50.054 true 00:13:50.054 13:32:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:50.313 [2024-07-15 13:32:29.598611] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:50.313 [2024-07-15 13:32:29.598659] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:50.313 [2024-07-15 13:32:29.598683] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad7910 00:13:50.313 [2024-07-15 13:32:29.598696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:50.313 [2024-07-15 13:32:29.600289] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:50.313 [2024-07-15 13:32:29.600321] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:50.313 BaseBdev2 00:13:50.313 13:32:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:50.313 13:32:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:50.572 BaseBdev3_malloc 00:13:50.572 13:32:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:50.832 true 00:13:50.832 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:51.091 [2024-07-15 13:32:30.338363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:51.091 [2024-07-15 13:32:30.338404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:51.091 [2024-07-15 13:32:30.338434] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad9bd0 00:13:51.091 [2024-07-15 13:32:30.338447] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:51.091 [2024-07-15 13:32:30.340019] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:51.091 [2024-07-15 13:32:30.340050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:51.091 BaseBdev3 00:13:51.091 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:51.350 [2024-07-15 13:32:30.583040] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:51.350 [2024-07-15 13:32:30.584381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:51.350 [2024-07-15 13:32:30.584465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:51.350 [2024-07-15 13:32:30.584676] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xadb280 00:13:51.350 [2024-07-15 13:32:30.584688] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:51.350 [2024-07-15 13:32:30.584886] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xadae20 00:13:51.350 [2024-07-15 13:32:30.585044] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xadb280 00:13:51.350 [2024-07-15 13:32:30.585055] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xadb280 00:13:51.350 [2024-07-15 13:32:30.585159] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.350 
13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:51.350 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.350 "name": "raid_bdev1", 00:13:51.350 "uuid": "01663ae3-6817-421c-957e-45f9240cc9fe", 00:13:51.350 "strip_size_kb": 64, 00:13:51.350 "state": "online", 00:13:51.350 "raid_level": "raid0", 00:13:51.350 "superblock": true, 00:13:51.350 "num_base_bdevs": 3, 00:13:51.350 "num_base_bdevs_discovered": 3, 00:13:51.350 "num_base_bdevs_operational": 3, 00:13:51.350 "base_bdevs_list": [ 00:13:51.350 { 00:13:51.350 "name": "BaseBdev1", 00:13:51.350 "uuid": "48b74f6f-2811-535a-bb2b-1a6b5ffad67d", 00:13:51.350 "is_configured": true, 00:13:51.350 "data_offset": 2048, 00:13:51.350 "data_size": 63488 00:13:51.350 }, 00:13:51.351 { 00:13:51.351 "name": "BaseBdev2", 00:13:51.351 "uuid": "1dd3a8af-50a4-5ebd-a599-fb1a4b6e64bd", 00:13:51.351 "is_configured": true, 00:13:51.351 "data_offset": 2048, 00:13:51.351 "data_size": 63488 00:13:51.351 }, 00:13:51.351 { 00:13:51.351 "name": "BaseBdev3", 00:13:51.351 "uuid": "75b0eeb6-1bf6-540d-9b88-e23347dd75f3", 00:13:51.351 "is_configured": true, 00:13:51.351 "data_offset": 2048, 00:13:51.351 "data_size": 63488 00:13:51.351 } 00:13:51.351 ] 00:13:51.351 }' 00:13:51.351 13:32:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.351 13:32:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.285 13:32:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:13:52.285 13:32:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:52.285 [2024-07-15 13:32:31.457632] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9295b0 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.222 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:53.480 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.480 "name": "raid_bdev1", 00:13:53.480 "uuid": "01663ae3-6817-421c-957e-45f9240cc9fe", 00:13:53.480 "strip_size_kb": 64, 00:13:53.480 "state": "online", 00:13:53.480 "raid_level": "raid0", 00:13:53.480 "superblock": true, 00:13:53.480 "num_base_bdevs": 3, 00:13:53.480 "num_base_bdevs_discovered": 3, 00:13:53.480 "num_base_bdevs_operational": 3, 00:13:53.480 "base_bdevs_list": [ 00:13:53.480 { 00:13:53.480 "name": "BaseBdev1", 00:13:53.480 "uuid": "48b74f6f-2811-535a-bb2b-1a6b5ffad67d", 00:13:53.480 "is_configured": true, 00:13:53.480 "data_offset": 2048, 00:13:53.480 "data_size": 63488 00:13:53.480 }, 00:13:53.480 { 00:13:53.480 "name": "BaseBdev2", 00:13:53.480 "uuid": "1dd3a8af-50a4-5ebd-a599-fb1a4b6e64bd", 00:13:53.480 "is_configured": true, 00:13:53.480 "data_offset": 2048, 00:13:53.480 "data_size": 63488 00:13:53.480 }, 00:13:53.480 { 00:13:53.480 "name": "BaseBdev3", 00:13:53.480 "uuid": "75b0eeb6-1bf6-540d-9b88-e23347dd75f3", 00:13:53.480 "is_configured": true, 00:13:53.480 "data_offset": 2048, 00:13:53.480 "data_size": 63488 00:13:53.481 } 00:13:53.481 ] 00:13:53.481 }' 00:13:53.481 13:32:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.481 13:32:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.048 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:54.305 [2024-07-15 13:32:33.639266] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:54.305 [2024-07-15 13:32:33.639301] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:54.305 [2024-07-15 13:32:33.642458] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:54.305 [2024-07-15 13:32:33.642495] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:54.305 [2024-07-15 13:32:33.642530] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:54.305 [2024-07-15 13:32:33.642541] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xadb280 name raid_bdev1, state offline 00:13:54.305 0 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2098255 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2098255 ']' 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2098255 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2098255 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2098255' 00:13:54.305 killing process with pid 2098255 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2098255 00:13:54.305 [2024-07-15 13:32:33.710412] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:13:54.305 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2098255 00:13:54.563 [2024-07-15 13:32:33.731965] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.kEoWwuV7Yq 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:13:54.563 00:13:54.563 real 0m6.280s 00:13:54.563 user 0m10.200s 00:13:54.563 sys 0m1.194s 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:54.563 13:32:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.563 ************************************ 00:13:54.563 END TEST raid_read_error_test 00:13:54.563 ************************************ 00:13:54.822 13:32:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:54.822 13:32:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:13:54.822 13:32:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:54.822 13:32:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:54.822 13:32:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:54.822 ************************************ 
00:13:54.822 START TEST raid_write_error_test 00:13:54.822 ************************************ 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2ejZCI6uIg 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2099225 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2099225 /var/tmp/spdk-raid.sock 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2099225 ']' 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:54.822 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:54.822 13:32:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.822 [2024-07-15 13:32:34.123138] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:13:54.822 [2024-07-15 13:32:34.123203] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2099225 ] 00:13:55.080 [2024-07-15 13:32:34.252544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:55.080 [2024-07-15 13:32:34.358714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:55.080 [2024-07-15 13:32:34.425542] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:55.080 [2024-07-15 13:32:34.425572] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:55.647 13:32:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:55.647 13:32:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:55.647 13:32:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:55.647 13:32:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:55.954 BaseBdev1_malloc 00:13:55.954 13:32:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:56.212 true 00:13:56.212 13:32:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:56.469 [2024-07-15 13:32:35.767282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:56.469 [2024-07-15 13:32:35.767328] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:56.469 [2024-07-15 13:32:35.767349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd30d0 00:13:56.469 [2024-07-15 13:32:35.767363] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:56.469 [2024-07-15 13:32:35.769255] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:56.469 [2024-07-15 13:32:35.769290] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:56.469 BaseBdev1 00:13:56.469 13:32:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:56.469 13:32:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:56.755 BaseBdev2_malloc 00:13:56.755 13:32:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:57.045 true 00:13:57.045 13:32:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:57.303 [2024-07-15 13:32:36.511040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:57.303 [2024-07-15 13:32:36.511086] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.303 [2024-07-15 13:32:36.511107] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd7910 00:13:57.303 [2024-07-15 13:32:36.511120] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.303 [2024-07-15 13:32:36.512715] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.303 [2024-07-15 13:32:36.512761] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:57.303 BaseBdev2 00:13:57.303 13:32:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:57.303 13:32:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:57.561 BaseBdev3_malloc 00:13:57.561 13:32:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:57.819 true 00:13:57.819 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:58.077 [2024-07-15 13:32:37.249604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:58.077 [2024-07-15 13:32:37.249648] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:58.077 [2024-07-15 13:32:37.249669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd9bd0 00:13:58.077 [2024-07-15 13:32:37.249682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:58.077 [2024-07-15 13:32:37.251279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:13:58.077 [2024-07-15 13:32:37.251309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:58.077 BaseBdev3 00:13:58.077 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:58.077 [2024-07-15 13:32:37.422096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:58.077 [2024-07-15 13:32:37.423310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.077 [2024-07-15 13:32:37.423385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:58.077 [2024-07-15 13:32:37.423589] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbdb280 00:13:58.077 [2024-07-15 13:32:37.423600] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:58.077 [2024-07-15 13:32:37.423786] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbdae20 00:13:58.077 [2024-07-15 13:32:37.423939] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbdb280 00:13:58.078 [2024-07-15 13:32:37.423949] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbdb280 00:13:58.078 [2024-07-15 13:32:37.424051] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.078 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:58.336 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.336 "name": "raid_bdev1", 00:13:58.336 "uuid": "2b384229-bc5c-4c7d-bf36-37101bc12d7f", 00:13:58.336 "strip_size_kb": 64, 00:13:58.336 "state": "online", 00:13:58.336 "raid_level": "raid0", 00:13:58.336 "superblock": true, 00:13:58.336 "num_base_bdevs": 3, 00:13:58.336 "num_base_bdevs_discovered": 3, 00:13:58.336 "num_base_bdevs_operational": 3, 00:13:58.336 "base_bdevs_list": [ 00:13:58.336 { 00:13:58.336 "name": "BaseBdev1", 00:13:58.336 "uuid": "63b554ef-0b51-5e69-96ed-4fff6f089a89", 00:13:58.336 "is_configured": true, 00:13:58.336 "data_offset": 2048, 00:13:58.336 "data_size": 63488 00:13:58.336 }, 00:13:58.336 { 00:13:58.336 "name": "BaseBdev2", 00:13:58.336 "uuid": "2397b23f-fb3f-5cf8-b480-99b94160703f", 00:13:58.336 "is_configured": true, 00:13:58.336 "data_offset": 2048, 00:13:58.336 "data_size": 63488 00:13:58.336 }, 00:13:58.336 { 00:13:58.336 "name": "BaseBdev3", 00:13:58.336 
"uuid": "ea51c5c2-067d-5b2e-a824-f016f984f8b6", 00:13:58.336 "is_configured": true, 00:13:58.336 "data_offset": 2048, 00:13:58.336 "data_size": 63488 00:13:58.336 } 00:13:58.336 ] 00:13:58.336 }' 00:13:58.336 13:32:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.336 13:32:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.904 13:32:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:58.904 13:32:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:59.162 [2024-07-15 13:32:38.429037] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa295b0 00:14:00.098 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.358 13:32:39 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.358 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:00.616 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.616 "name": "raid_bdev1", 00:14:00.616 "uuid": "2b384229-bc5c-4c7d-bf36-37101bc12d7f", 00:14:00.616 "strip_size_kb": 64, 00:14:00.616 "state": "online", 00:14:00.616 "raid_level": "raid0", 00:14:00.616 "superblock": true, 00:14:00.616 "num_base_bdevs": 3, 00:14:00.616 "num_base_bdevs_discovered": 3, 00:14:00.616 "num_base_bdevs_operational": 3, 00:14:00.616 "base_bdevs_list": [ 00:14:00.616 { 00:14:00.616 "name": "BaseBdev1", 00:14:00.616 "uuid": "63b554ef-0b51-5e69-96ed-4fff6f089a89", 00:14:00.616 "is_configured": true, 00:14:00.616 "data_offset": 2048, 00:14:00.616 "data_size": 63488 00:14:00.616 }, 00:14:00.616 { 00:14:00.616 "name": "BaseBdev2", 00:14:00.616 "uuid": "2397b23f-fb3f-5cf8-b480-99b94160703f", 00:14:00.616 "is_configured": true, 00:14:00.616 "data_offset": 2048, 00:14:00.616 "data_size": 63488 00:14:00.616 }, 00:14:00.616 { 00:14:00.616 "name": "BaseBdev3", 00:14:00.616 "uuid": "ea51c5c2-067d-5b2e-a824-f016f984f8b6", 00:14:00.616 "is_configured": true, 00:14:00.616 "data_offset": 2048, 00:14:00.616 "data_size": 
63488 00:14:00.616 } 00:14:00.616 ] 00:14:00.616 }' 00:14:00.616 13:32:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.616 13:32:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.184 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:01.184 [2024-07-15 13:32:40.606189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:01.184 [2024-07-15 13:32:40.606227] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:01.184 [2024-07-15 13:32:40.609403] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:01.443 [2024-07-15 13:32:40.609443] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:01.443 [2024-07-15 13:32:40.609479] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:01.443 [2024-07-15 13:32:40.609490] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbdb280 name raid_bdev1, state offline 00:14:01.443 0 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2099225 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2099225 ']' 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2099225 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2099225 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2099225' 00:14:01.443 killing process with pid 2099225 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2099225 00:14:01.443 [2024-07-15 13:32:40.677178] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:01.443 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2099225 00:14:01.443 [2024-07-15 13:32:40.700803] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2ejZCI6uIg 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:14:01.702 00:14:01.702 real 0m6.892s 00:14:01.702 user 0m10.879s 00:14:01.702 sys 0m1.230s 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:01.702 13:32:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.702 ************************************ 00:14:01.702 END TEST raid_write_error_test 00:14:01.702 
************************************ 00:14:01.702 13:32:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:01.702 13:32:40 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:01.702 13:32:40 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:01.702 13:32:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:01.702 13:32:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:01.702 13:32:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:01.702 ************************************ 00:14:01.702 START TEST raid_state_function_test 00:14:01.702 ************************************ 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:01.702 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2100203 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2100203' 00:14:01.703 Process raid pid: 2100203 00:14:01.703 13:32:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2100203 /var/tmp/spdk-raid.sock 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2100203 ']' 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:01.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:01.703 13:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.703 [2024-07-15 13:32:41.103438] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:14:01.703 [2024-07-15 13:32:41.103509] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:01.962 [2024-07-15 13:32:41.237614] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:01.962 [2024-07-15 13:32:41.339842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:02.221 [2024-07-15 13:32:41.404192] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:02.221 [2024-07-15 13:32:41.404228] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:02.786 13:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:02.786 13:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:02.786 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:03.045 [2024-07-15 13:32:42.259194] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:03.045 [2024-07-15 13:32:42.259239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:03.045 [2024-07-15 13:32:42.259250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:03.045 [2024-07-15 13:32:42.259262] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:03.045 [2024-07-15 13:32:42.259271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:03.045 [2024-07-15 13:32:42.259282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:03.045 
13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.045 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.303 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.303 "name": "Existed_Raid", 00:14:03.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.303 "strip_size_kb": 64, 00:14:03.303 "state": "configuring", 00:14:03.303 "raid_level": "concat", 00:14:03.303 "superblock": false, 00:14:03.303 "num_base_bdevs": 3, 00:14:03.303 "num_base_bdevs_discovered": 0, 00:14:03.303 "num_base_bdevs_operational": 3, 00:14:03.303 "base_bdevs_list": [ 00:14:03.303 { 
00:14:03.303 "name": "BaseBdev1", 00:14:03.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.303 "is_configured": false, 00:14:03.303 "data_offset": 0, 00:14:03.303 "data_size": 0 00:14:03.303 }, 00:14:03.303 { 00:14:03.303 "name": "BaseBdev2", 00:14:03.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.303 "is_configured": false, 00:14:03.303 "data_offset": 0, 00:14:03.303 "data_size": 0 00:14:03.303 }, 00:14:03.303 { 00:14:03.303 "name": "BaseBdev3", 00:14:03.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.303 "is_configured": false, 00:14:03.303 "data_offset": 0, 00:14:03.303 "data_size": 0 00:14:03.303 } 00:14:03.303 ] 00:14:03.303 }' 00:14:03.303 13:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.303 13:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.882 13:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:04.141 [2024-07-15 13:32:43.357959] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:04.141 [2024-07-15 13:32:43.357992] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228da80 name Existed_Raid, state configuring 00:14:04.141 13:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:04.400 [2024-07-15 13:32:43.602627] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:04.400 [2024-07-15 13:32:43.602656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:04.400 [2024-07-15 13:32:43.602667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:14:04.400 [2024-07-15 13:32:43.602679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:04.400 [2024-07-15 13:32:43.602689] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:04.400 [2024-07-15 13:32:43.602701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:04.400 13:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:04.659 [2024-07-15 13:32:43.853207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:04.659 BaseBdev1 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:04.659 13:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.918 13:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:04.918 [ 00:14:04.918 { 00:14:04.918 "name": "BaseBdev1", 00:14:04.918 "aliases": [ 00:14:04.918 
"c4aa16af-b71e-4094-849a-eaaea95d55ed" 00:14:04.918 ], 00:14:04.918 "product_name": "Malloc disk", 00:14:04.918 "block_size": 512, 00:14:04.918 "num_blocks": 65536, 00:14:04.918 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:04.918 "assigned_rate_limits": { 00:14:04.918 "rw_ios_per_sec": 0, 00:14:04.918 "rw_mbytes_per_sec": 0, 00:14:04.918 "r_mbytes_per_sec": 0, 00:14:04.918 "w_mbytes_per_sec": 0 00:14:04.918 }, 00:14:04.918 "claimed": true, 00:14:04.918 "claim_type": "exclusive_write", 00:14:04.918 "zoned": false, 00:14:04.918 "supported_io_types": { 00:14:04.918 "read": true, 00:14:04.918 "write": true, 00:14:04.918 "unmap": true, 00:14:04.918 "flush": true, 00:14:04.918 "reset": true, 00:14:04.918 "nvme_admin": false, 00:14:04.918 "nvme_io": false, 00:14:04.918 "nvme_io_md": false, 00:14:04.918 "write_zeroes": true, 00:14:04.918 "zcopy": true, 00:14:04.918 "get_zone_info": false, 00:14:04.918 "zone_management": false, 00:14:04.918 "zone_append": false, 00:14:04.918 "compare": false, 00:14:04.918 "compare_and_write": false, 00:14:04.918 "abort": true, 00:14:04.918 "seek_hole": false, 00:14:04.918 "seek_data": false, 00:14:04.918 "copy": true, 00:14:04.918 "nvme_iov_md": false 00:14:04.918 }, 00:14:04.918 "memory_domains": [ 00:14:04.918 { 00:14:04.918 "dma_device_id": "system", 00:14:04.918 "dma_device_type": 1 00:14:04.918 }, 00:14:04.918 { 00:14:04.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.918 "dma_device_type": 2 00:14:04.918 } 00:14:04.918 ], 00:14:04.918 "driver_specific": {} 00:14:04.918 } 00:14:04.918 ] 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.177 "name": "Existed_Raid", 00:14:05.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.177 "strip_size_kb": 64, 00:14:05.177 "state": "configuring", 00:14:05.177 "raid_level": "concat", 00:14:05.177 "superblock": false, 00:14:05.177 "num_base_bdevs": 3, 00:14:05.177 "num_base_bdevs_discovered": 1, 00:14:05.177 "num_base_bdevs_operational": 3, 00:14:05.177 "base_bdevs_list": [ 00:14:05.177 { 00:14:05.177 "name": "BaseBdev1", 00:14:05.177 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:05.177 "is_configured": true, 00:14:05.177 "data_offset": 0, 00:14:05.177 "data_size": 65536 00:14:05.177 }, 00:14:05.177 { 00:14:05.177 "name": "BaseBdev2", 00:14:05.177 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:05.177 "is_configured": false, 00:14:05.177 "data_offset": 0, 00:14:05.177 "data_size": 0 00:14:05.177 }, 00:14:05.177 { 00:14:05.177 "name": "BaseBdev3", 00:14:05.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.177 "is_configured": false, 00:14:05.177 "data_offset": 0, 00:14:05.177 "data_size": 0 00:14:05.177 } 00:14:05.177 ] 00:14:05.177 }' 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.177 13:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.111 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:06.111 [2024-07-15 13:32:45.425386] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:06.111 [2024-07-15 13:32:45.425430] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228d310 name Existed_Raid, state configuring 00:14:06.111 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:06.369 [2024-07-15 13:32:45.593872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:06.369 [2024-07-15 13:32:45.595383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:06.369 [2024-07-15 13:32:45.595420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:06.369 [2024-07-15 13:32:45.595430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:06.369 [2024-07-15 13:32:45.595443] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.369 "name": "Existed_Raid", 00:14:06.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.369 "strip_size_kb": 64, 00:14:06.369 "state": "configuring", 00:14:06.369 
"raid_level": "concat", 00:14:06.369 "superblock": false, 00:14:06.369 "num_base_bdevs": 3, 00:14:06.369 "num_base_bdevs_discovered": 1, 00:14:06.369 "num_base_bdevs_operational": 3, 00:14:06.369 "base_bdevs_list": [ 00:14:06.369 { 00:14:06.369 "name": "BaseBdev1", 00:14:06.369 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:06.369 "is_configured": true, 00:14:06.369 "data_offset": 0, 00:14:06.369 "data_size": 65536 00:14:06.369 }, 00:14:06.369 { 00:14:06.369 "name": "BaseBdev2", 00:14:06.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.369 "is_configured": false, 00:14:06.369 "data_offset": 0, 00:14:06.369 "data_size": 0 00:14:06.369 }, 00:14:06.369 { 00:14:06.369 "name": "BaseBdev3", 00:14:06.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.369 "is_configured": false, 00:14:06.369 "data_offset": 0, 00:14:06.369 "data_size": 0 00:14:06.369 } 00:14:06.369 ] 00:14:06.369 }' 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.369 13:32:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.303 13:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:07.561 [2024-07-15 13:32:46.856692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:07.561 BaseBdev2 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:07.561 13:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.819 13:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:08.078 [ 00:14:08.078 { 00:14:08.078 "name": "BaseBdev2", 00:14:08.078 "aliases": [ 00:14:08.078 "aa20c3ab-b0df-4da2-927c-e23546fa0e42" 00:14:08.078 ], 00:14:08.078 "product_name": "Malloc disk", 00:14:08.078 "block_size": 512, 00:14:08.078 "num_blocks": 65536, 00:14:08.078 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:08.078 "assigned_rate_limits": { 00:14:08.078 "rw_ios_per_sec": 0, 00:14:08.078 "rw_mbytes_per_sec": 0, 00:14:08.078 "r_mbytes_per_sec": 0, 00:14:08.078 "w_mbytes_per_sec": 0 00:14:08.078 }, 00:14:08.078 "claimed": true, 00:14:08.078 "claim_type": "exclusive_write", 00:14:08.078 "zoned": false, 00:14:08.078 "supported_io_types": { 00:14:08.078 "read": true, 00:14:08.078 "write": true, 00:14:08.078 "unmap": true, 00:14:08.078 "flush": true, 00:14:08.078 "reset": true, 00:14:08.078 "nvme_admin": false, 00:14:08.078 "nvme_io": false, 00:14:08.078 "nvme_io_md": false, 00:14:08.078 "write_zeroes": true, 00:14:08.078 "zcopy": true, 00:14:08.078 "get_zone_info": false, 00:14:08.078 "zone_management": false, 00:14:08.078 "zone_append": false, 00:14:08.078 "compare": false, 00:14:08.078 "compare_and_write": false, 00:14:08.078 "abort": true, 00:14:08.078 "seek_hole": false, 00:14:08.078 "seek_data": false, 00:14:08.078 "copy": true, 00:14:08.078 "nvme_iov_md": false 00:14:08.078 }, 00:14:08.078 "memory_domains": [ 00:14:08.078 { 00:14:08.078 "dma_device_id": "system", 
00:14:08.078 "dma_device_type": 1 00:14:08.078 }, 00:14:08.078 { 00:14:08.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.078 "dma_device_type": 2 00:14:08.078 } 00:14:08.078 ], 00:14:08.078 "driver_specific": {} 00:14:08.078 } 00:14:08.078 ] 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.078 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.336 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.336 "name": "Existed_Raid", 00:14:08.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.336 "strip_size_kb": 64, 00:14:08.336 "state": "configuring", 00:14:08.336 "raid_level": "concat", 00:14:08.336 "superblock": false, 00:14:08.336 "num_base_bdevs": 3, 00:14:08.336 "num_base_bdevs_discovered": 2, 00:14:08.336 "num_base_bdevs_operational": 3, 00:14:08.336 "base_bdevs_list": [ 00:14:08.336 { 00:14:08.336 "name": "BaseBdev1", 00:14:08.336 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:08.336 "is_configured": true, 00:14:08.336 "data_offset": 0, 00:14:08.336 "data_size": 65536 00:14:08.336 }, 00:14:08.336 { 00:14:08.336 "name": "BaseBdev2", 00:14:08.336 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:08.336 "is_configured": true, 00:14:08.336 "data_offset": 0, 00:14:08.336 "data_size": 65536 00:14:08.336 }, 00:14:08.336 { 00:14:08.336 "name": "BaseBdev3", 00:14:08.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.336 "is_configured": false, 00:14:08.336 "data_offset": 0, 00:14:08.336 "data_size": 0 00:14:08.336 } 00:14:08.336 ] 00:14:08.336 }' 00:14:08.336 13:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.336 13:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.902 13:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:09.161 [2024-07-15 13:32:48.436293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:09.161 [2024-07-15 13:32:48.436333] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228e400 00:14:09.161 
[2024-07-15 13:32:48.436342] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:09.161 [2024-07-15 13:32:48.436594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x228def0 00:14:09.161 [2024-07-15 13:32:48.436715] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228e400 00:14:09.161 [2024-07-15 13:32:48.436725] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228e400 00:14:09.161 [2024-07-15 13:32:48.436887] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:09.161 BaseBdev3 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:09.161 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:09.727 13:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:09.727 [ 00:14:09.727 { 00:14:09.727 "name": "BaseBdev3", 00:14:09.727 "aliases": [ 00:14:09.727 "42ce30ab-e070-4c7e-b50e-07b837dbe51e" 00:14:09.727 ], 00:14:09.727 "product_name": "Malloc disk", 00:14:09.727 "block_size": 512, 00:14:09.727 
"num_blocks": 65536, 00:14:09.727 "uuid": "42ce30ab-e070-4c7e-b50e-07b837dbe51e", 00:14:09.727 "assigned_rate_limits": { 00:14:09.727 "rw_ios_per_sec": 0, 00:14:09.727 "rw_mbytes_per_sec": 0, 00:14:09.727 "r_mbytes_per_sec": 0, 00:14:09.727 "w_mbytes_per_sec": 0 00:14:09.727 }, 00:14:09.727 "claimed": true, 00:14:09.727 "claim_type": "exclusive_write", 00:14:09.727 "zoned": false, 00:14:09.727 "supported_io_types": { 00:14:09.727 "read": true, 00:14:09.727 "write": true, 00:14:09.727 "unmap": true, 00:14:09.727 "flush": true, 00:14:09.727 "reset": true, 00:14:09.727 "nvme_admin": false, 00:14:09.727 "nvme_io": false, 00:14:09.727 "nvme_io_md": false, 00:14:09.727 "write_zeroes": true, 00:14:09.727 "zcopy": true, 00:14:09.727 "get_zone_info": false, 00:14:09.727 "zone_management": false, 00:14:09.727 "zone_append": false, 00:14:09.727 "compare": false, 00:14:09.727 "compare_and_write": false, 00:14:09.727 "abort": true, 00:14:09.727 "seek_hole": false, 00:14:09.727 "seek_data": false, 00:14:09.727 "copy": true, 00:14:09.727 "nvme_iov_md": false 00:14:09.727 }, 00:14:09.727 "memory_domains": [ 00:14:09.727 { 00:14:09.727 "dma_device_id": "system", 00:14:09.727 "dma_device_type": 1 00:14:09.727 }, 00:14:09.727 { 00:14:09.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.727 "dma_device_type": 2 00:14:09.727 } 00:14:09.727 ], 00:14:09.727 "driver_specific": {} 00:14:09.727 } 00:14:09.727 ] 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.727 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.293 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.294 "name": "Existed_Raid", 00:14:10.294 "uuid": "4bd526cb-0a4e-4423-96a6-f33128f73f5d", 00:14:10.294 "strip_size_kb": 64, 00:14:10.294 "state": "online", 00:14:10.294 "raid_level": "concat", 00:14:10.294 "superblock": false, 00:14:10.294 "num_base_bdevs": 3, 00:14:10.294 "num_base_bdevs_discovered": 3, 00:14:10.294 "num_base_bdevs_operational": 3, 00:14:10.294 "base_bdevs_list": [ 00:14:10.294 { 00:14:10.294 "name": "BaseBdev1", 00:14:10.294 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:10.294 "is_configured": true, 00:14:10.294 "data_offset": 0, 00:14:10.294 "data_size": 65536 00:14:10.294 }, 00:14:10.294 { 00:14:10.294 "name": "BaseBdev2", 
00:14:10.294 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:10.294 "is_configured": true, 00:14:10.294 "data_offset": 0, 00:14:10.294 "data_size": 65536 00:14:10.294 }, 00:14:10.294 { 00:14:10.294 "name": "BaseBdev3", 00:14:10.294 "uuid": "42ce30ab-e070-4c7e-b50e-07b837dbe51e", 00:14:10.294 "is_configured": true, 00:14:10.294 "data_offset": 0, 00:14:10.294 "data_size": 65536 00:14:10.294 } 00:14:10.294 ] 00:14:10.294 }' 00:14:10.294 13:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.294 13:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:10.861 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.119 [2024-07-15 13:32:50.449974] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.119 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.119 "name": "Existed_Raid", 00:14:11.119 "aliases": [ 00:14:11.119 "4bd526cb-0a4e-4423-96a6-f33128f73f5d" 00:14:11.119 ], 00:14:11.120 "product_name": 
"Raid Volume", 00:14:11.120 "block_size": 512, 00:14:11.120 "num_blocks": 196608, 00:14:11.120 "uuid": "4bd526cb-0a4e-4423-96a6-f33128f73f5d", 00:14:11.120 "assigned_rate_limits": { 00:14:11.120 "rw_ios_per_sec": 0, 00:14:11.120 "rw_mbytes_per_sec": 0, 00:14:11.120 "r_mbytes_per_sec": 0, 00:14:11.120 "w_mbytes_per_sec": 0 00:14:11.120 }, 00:14:11.120 "claimed": false, 00:14:11.120 "zoned": false, 00:14:11.120 "supported_io_types": { 00:14:11.120 "read": true, 00:14:11.120 "write": true, 00:14:11.120 "unmap": true, 00:14:11.120 "flush": true, 00:14:11.120 "reset": true, 00:14:11.120 "nvme_admin": false, 00:14:11.120 "nvme_io": false, 00:14:11.120 "nvme_io_md": false, 00:14:11.120 "write_zeroes": true, 00:14:11.120 "zcopy": false, 00:14:11.120 "get_zone_info": false, 00:14:11.120 "zone_management": false, 00:14:11.120 "zone_append": false, 00:14:11.120 "compare": false, 00:14:11.120 "compare_and_write": false, 00:14:11.120 "abort": false, 00:14:11.120 "seek_hole": false, 00:14:11.120 "seek_data": false, 00:14:11.120 "copy": false, 00:14:11.120 "nvme_iov_md": false 00:14:11.120 }, 00:14:11.120 "memory_domains": [ 00:14:11.120 { 00:14:11.120 "dma_device_id": "system", 00:14:11.120 "dma_device_type": 1 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.120 "dma_device_type": 2 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "dma_device_id": "system", 00:14:11.120 "dma_device_type": 1 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.120 "dma_device_type": 2 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "dma_device_id": "system", 00:14:11.120 "dma_device_type": 1 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.120 "dma_device_type": 2 00:14:11.120 } 00:14:11.120 ], 00:14:11.120 "driver_specific": { 00:14:11.120 "raid": { 00:14:11.120 "uuid": "4bd526cb-0a4e-4423-96a6-f33128f73f5d", 00:14:11.120 "strip_size_kb": 64, 00:14:11.120 "state": 
"online", 00:14:11.120 "raid_level": "concat", 00:14:11.120 "superblock": false, 00:14:11.120 "num_base_bdevs": 3, 00:14:11.120 "num_base_bdevs_discovered": 3, 00:14:11.120 "num_base_bdevs_operational": 3, 00:14:11.120 "base_bdevs_list": [ 00:14:11.120 { 00:14:11.120 "name": "BaseBdev1", 00:14:11.120 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:11.120 "is_configured": true, 00:14:11.120 "data_offset": 0, 00:14:11.120 "data_size": 65536 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "name": "BaseBdev2", 00:14:11.120 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:11.120 "is_configured": true, 00:14:11.120 "data_offset": 0, 00:14:11.120 "data_size": 65536 00:14:11.120 }, 00:14:11.120 { 00:14:11.120 "name": "BaseBdev3", 00:14:11.120 "uuid": "42ce30ab-e070-4c7e-b50e-07b837dbe51e", 00:14:11.120 "is_configured": true, 00:14:11.120 "data_offset": 0, 00:14:11.120 "data_size": 65536 00:14:11.120 } 00:14:11.120 ] 00:14:11.120 } 00:14:11.120 } 00:14:11.120 }' 00:14:11.120 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.120 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:11.120 BaseBdev2 00:14:11.120 BaseBdev3' 00:14:11.120 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.120 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:11.120 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:11.383 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:11.384 "name": "BaseBdev1", 00:14:11.384 "aliases": [ 00:14:11.384 "c4aa16af-b71e-4094-849a-eaaea95d55ed" 00:14:11.384 ], 00:14:11.384 "product_name": "Malloc 
disk", 00:14:11.384 "block_size": 512, 00:14:11.384 "num_blocks": 65536, 00:14:11.384 "uuid": "c4aa16af-b71e-4094-849a-eaaea95d55ed", 00:14:11.384 "assigned_rate_limits": { 00:14:11.384 "rw_ios_per_sec": 0, 00:14:11.384 "rw_mbytes_per_sec": 0, 00:14:11.384 "r_mbytes_per_sec": 0, 00:14:11.384 "w_mbytes_per_sec": 0 00:14:11.384 }, 00:14:11.384 "claimed": true, 00:14:11.384 "claim_type": "exclusive_write", 00:14:11.384 "zoned": false, 00:14:11.384 "supported_io_types": { 00:14:11.384 "read": true, 00:14:11.384 "write": true, 00:14:11.384 "unmap": true, 00:14:11.384 "flush": true, 00:14:11.384 "reset": true, 00:14:11.384 "nvme_admin": false, 00:14:11.384 "nvme_io": false, 00:14:11.384 "nvme_io_md": false, 00:14:11.384 "write_zeroes": true, 00:14:11.384 "zcopy": true, 00:14:11.384 "get_zone_info": false, 00:14:11.384 "zone_management": false, 00:14:11.384 "zone_append": false, 00:14:11.384 "compare": false, 00:14:11.384 "compare_and_write": false, 00:14:11.384 "abort": true, 00:14:11.384 "seek_hole": false, 00:14:11.384 "seek_data": false, 00:14:11.384 "copy": true, 00:14:11.384 "nvme_iov_md": false 00:14:11.384 }, 00:14:11.384 "memory_domains": [ 00:14:11.384 { 00:14:11.384 "dma_device_id": "system", 00:14:11.384 "dma_device_type": 1 00:14:11.384 }, 00:14:11.384 { 00:14:11.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.384 "dma_device_type": 2 00:14:11.384 } 00:14:11.384 ], 00:14:11.384 "driver_specific": {} 00:14:11.384 }' 00:14:11.384 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.647 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.647 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:11.647 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.647 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.647 13:32:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:11.647 13:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.647 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.647 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:11.647 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.907 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.907 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:11.907 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.907 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:11.907 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.221 "name": "BaseBdev2", 00:14:12.221 "aliases": [ 00:14:12.221 "aa20c3ab-b0df-4da2-927c-e23546fa0e42" 00:14:12.221 ], 00:14:12.221 "product_name": "Malloc disk", 00:14:12.221 "block_size": 512, 00:14:12.221 "num_blocks": 65536, 00:14:12.221 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:12.221 "assigned_rate_limits": { 00:14:12.221 "rw_ios_per_sec": 0, 00:14:12.221 "rw_mbytes_per_sec": 0, 00:14:12.221 "r_mbytes_per_sec": 0, 00:14:12.221 "w_mbytes_per_sec": 0 00:14:12.221 }, 00:14:12.221 "claimed": true, 00:14:12.221 "claim_type": "exclusive_write", 00:14:12.221 "zoned": false, 00:14:12.221 "supported_io_types": { 00:14:12.221 "read": true, 00:14:12.221 "write": true, 00:14:12.221 "unmap": true, 00:14:12.221 "flush": true, 00:14:12.221 "reset": 
true, 00:14:12.221 "nvme_admin": false, 00:14:12.221 "nvme_io": false, 00:14:12.221 "nvme_io_md": false, 00:14:12.221 "write_zeroes": true, 00:14:12.221 "zcopy": true, 00:14:12.221 "get_zone_info": false, 00:14:12.221 "zone_management": false, 00:14:12.221 "zone_append": false, 00:14:12.221 "compare": false, 00:14:12.221 "compare_and_write": false, 00:14:12.221 "abort": true, 00:14:12.221 "seek_hole": false, 00:14:12.221 "seek_data": false, 00:14:12.221 "copy": true, 00:14:12.221 "nvme_iov_md": false 00:14:12.221 }, 00:14:12.221 "memory_domains": [ 00:14:12.221 { 00:14:12.221 "dma_device_id": "system", 00:14:12.221 "dma_device_type": 1 00:14:12.221 }, 00:14:12.221 { 00:14:12.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.221 "dma_device_type": 2 00:14:12.221 } 00:14:12.221 ], 00:14:12.221 "driver_specific": {} 00:14:12.221 }' 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.221 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.481 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.481 13:32:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.481 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.481 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:12.481 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.740 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.740 "name": "BaseBdev3", 00:14:12.740 "aliases": [ 00:14:12.740 "42ce30ab-e070-4c7e-b50e-07b837dbe51e" 00:14:12.740 ], 00:14:12.740 "product_name": "Malloc disk", 00:14:12.740 "block_size": 512, 00:14:12.740 "num_blocks": 65536, 00:14:12.740 "uuid": "42ce30ab-e070-4c7e-b50e-07b837dbe51e", 00:14:12.740 "assigned_rate_limits": { 00:14:12.740 "rw_ios_per_sec": 0, 00:14:12.740 "rw_mbytes_per_sec": 0, 00:14:12.740 "r_mbytes_per_sec": 0, 00:14:12.740 "w_mbytes_per_sec": 0 00:14:12.740 }, 00:14:12.740 "claimed": true, 00:14:12.740 "claim_type": "exclusive_write", 00:14:12.740 "zoned": false, 00:14:12.740 "supported_io_types": { 00:14:12.740 "read": true, 00:14:12.740 "write": true, 00:14:12.740 "unmap": true, 00:14:12.740 "flush": true, 00:14:12.740 "reset": true, 00:14:12.740 "nvme_admin": false, 00:14:12.740 "nvme_io": false, 00:14:12.740 "nvme_io_md": false, 00:14:12.740 "write_zeroes": true, 00:14:12.740 "zcopy": true, 00:14:12.740 "get_zone_info": false, 00:14:12.740 "zone_management": false, 00:14:12.740 "zone_append": false, 00:14:12.740 "compare": false, 00:14:12.740 "compare_and_write": false, 00:14:12.740 "abort": true, 00:14:12.740 "seek_hole": false, 00:14:12.740 "seek_data": false, 00:14:12.740 "copy": true, 00:14:12.740 "nvme_iov_md": false 00:14:12.740 }, 00:14:12.740 "memory_domains": [ 00:14:12.740 { 00:14:12.740 "dma_device_id": "system", 00:14:12.740 
"dma_device_type": 1 00:14:12.740 }, 00:14:12.740 { 00:14:12.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.740 "dma_device_type": 2 00:14:12.740 } 00:14:12.740 ], 00:14:12.740 "driver_specific": {} 00:14:12.740 }' 00:14:12.740 13:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.740 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.999 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:13.258 [2024-07-15 13:32:52.515223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:13.258 [2024-07-15 13:32:52.515252] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:13.258 [2024-07-15 13:32:52.515294] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.258 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.517 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.517 "name": "Existed_Raid", 00:14:13.517 "uuid": "4bd526cb-0a4e-4423-96a6-f33128f73f5d", 00:14:13.517 "strip_size_kb": 64, 00:14:13.517 "state": "offline", 00:14:13.517 "raid_level": "concat", 00:14:13.517 "superblock": false, 00:14:13.517 "num_base_bdevs": 3, 00:14:13.517 "num_base_bdevs_discovered": 2, 00:14:13.517 "num_base_bdevs_operational": 2, 00:14:13.517 "base_bdevs_list": [ 00:14:13.517 { 00:14:13.517 "name": null, 00:14:13.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.517 "is_configured": false, 00:14:13.517 "data_offset": 0, 00:14:13.517 "data_size": 65536 00:14:13.517 }, 00:14:13.517 { 00:14:13.517 "name": "BaseBdev2", 00:14:13.518 "uuid": "aa20c3ab-b0df-4da2-927c-e23546fa0e42", 00:14:13.518 "is_configured": true, 00:14:13.518 "data_offset": 0, 00:14:13.518 "data_size": 65536 00:14:13.518 }, 00:14:13.518 { 00:14:13.518 "name": "BaseBdev3", 00:14:13.518 "uuid": "42ce30ab-e070-4c7e-b50e-07b837dbe51e", 00:14:13.518 "is_configured": true, 00:14:13.518 "data_offset": 0, 00:14:13.518 "data_size": 65536 00:14:13.518 } 00:14:13.518 ] 00:14:13.518 }' 00:14:13.518 13:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.518 13:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.084 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:14.084 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:14.084 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.084 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:14.342 13:32:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:14.342 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:14.342 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:14.600 [2024-07-15 13:32:53.807776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:14.600 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:14.600 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:14.600 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.600 13:32:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:14.859 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:14.859 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:14.859 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:15.119 [2024-07-15 13:32:54.309365] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:15.119 [2024-07-15 13:32:54.309420] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228e400 name Existed_Raid, state offline 00:14:15.119 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:15.119 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:15.119 13:32:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.119 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:15.378 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:15.638 BaseBdev2 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:15.638 13:32:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:15.898 13:32:55 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:15.898 [ 00:14:15.898 { 00:14:15.898 "name": "BaseBdev2", 00:14:15.898 "aliases": [ 00:14:15.898 "e36674d3-fb10-4a52-b5fe-36813433d223" 00:14:15.898 ], 00:14:15.898 "product_name": "Malloc disk", 00:14:15.898 "block_size": 512, 00:14:15.898 "num_blocks": 65536, 00:14:15.898 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:15.898 "assigned_rate_limits": { 00:14:15.898 "rw_ios_per_sec": 0, 00:14:15.898 "rw_mbytes_per_sec": 0, 00:14:15.898 "r_mbytes_per_sec": 0, 00:14:15.898 "w_mbytes_per_sec": 0 00:14:15.898 }, 00:14:15.898 "claimed": false, 00:14:15.898 "zoned": false, 00:14:15.898 "supported_io_types": { 00:14:15.898 "read": true, 00:14:15.898 "write": true, 00:14:15.898 "unmap": true, 00:14:15.898 "flush": true, 00:14:15.898 "reset": true, 00:14:15.898 "nvme_admin": false, 00:14:15.898 "nvme_io": false, 00:14:15.898 "nvme_io_md": false, 00:14:15.898 "write_zeroes": true, 00:14:15.898 "zcopy": true, 00:14:15.898 "get_zone_info": false, 00:14:15.898 "zone_management": false, 00:14:15.898 "zone_append": false, 00:14:15.898 "compare": false, 00:14:15.898 "compare_and_write": false, 00:14:15.898 "abort": true, 00:14:15.898 "seek_hole": false, 00:14:15.898 "seek_data": false, 00:14:15.898 "copy": true, 00:14:15.898 "nvme_iov_md": false 00:14:15.898 }, 00:14:15.898 "memory_domains": [ 00:14:15.898 { 00:14:15.898 "dma_device_id": "system", 00:14:15.898 "dma_device_type": 1 00:14:15.898 }, 00:14:15.898 { 00:14:15.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.898 "dma_device_type": 2 00:14:15.898 } 00:14:15.898 ], 00:14:15.898 "driver_specific": {} 00:14:15.898 } 00:14:15.898 ] 00:14:15.898 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:15.898 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:15.898 13:32:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:15.898 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:16.156 BaseBdev3 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:16.156 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:16.415 13:32:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:16.674 [ 00:14:16.674 { 00:14:16.674 "name": "BaseBdev3", 00:14:16.674 "aliases": [ 00:14:16.674 "1e9fa3b2-3da9-48ce-a035-a074ff48c62a" 00:14:16.674 ], 00:14:16.674 "product_name": "Malloc disk", 00:14:16.674 "block_size": 512, 00:14:16.674 "num_blocks": 65536, 00:14:16.674 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:16.674 "assigned_rate_limits": { 00:14:16.674 "rw_ios_per_sec": 0, 00:14:16.674 "rw_mbytes_per_sec": 0, 00:14:16.674 "r_mbytes_per_sec": 0, 00:14:16.674 "w_mbytes_per_sec": 0 00:14:16.674 }, 00:14:16.674 "claimed": false, 00:14:16.674 
"zoned": false, 00:14:16.674 "supported_io_types": { 00:14:16.674 "read": true, 00:14:16.674 "write": true, 00:14:16.674 "unmap": true, 00:14:16.674 "flush": true, 00:14:16.674 "reset": true, 00:14:16.674 "nvme_admin": false, 00:14:16.674 "nvme_io": false, 00:14:16.674 "nvme_io_md": false, 00:14:16.674 "write_zeroes": true, 00:14:16.674 "zcopy": true, 00:14:16.674 "get_zone_info": false, 00:14:16.674 "zone_management": false, 00:14:16.674 "zone_append": false, 00:14:16.674 "compare": false, 00:14:16.674 "compare_and_write": false, 00:14:16.674 "abort": true, 00:14:16.674 "seek_hole": false, 00:14:16.674 "seek_data": false, 00:14:16.674 "copy": true, 00:14:16.674 "nvme_iov_md": false 00:14:16.674 }, 00:14:16.674 "memory_domains": [ 00:14:16.674 { 00:14:16.674 "dma_device_id": "system", 00:14:16.674 "dma_device_type": 1 00:14:16.674 }, 00:14:16.674 { 00:14:16.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.674 "dma_device_type": 2 00:14:16.674 } 00:14:16.674 ], 00:14:16.674 "driver_specific": {} 00:14:16.674 } 00:14:16.674 ] 00:14:16.674 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:16.674 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:16.674 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:16.674 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:16.933 [2024-07-15 13:32:56.280978] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:16.933 [2024-07-15 13:32:56.281022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:16.933 [2024-07-15 13:32:56.281043] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is 
claimed 00:14:16.933 [2024-07-15 13:32:56.282361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.933 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.192 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.192 "name": "Existed_Raid", 00:14:17.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.192 "strip_size_kb": 64, 00:14:17.192 "state": "configuring", 00:14:17.192 "raid_level": "concat", 00:14:17.192 "superblock": false, 00:14:17.192 
"num_base_bdevs": 3, 00:14:17.193 "num_base_bdevs_discovered": 2, 00:14:17.193 "num_base_bdevs_operational": 3, 00:14:17.193 "base_bdevs_list": [ 00:14:17.193 { 00:14:17.193 "name": "BaseBdev1", 00:14:17.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.193 "is_configured": false, 00:14:17.193 "data_offset": 0, 00:14:17.193 "data_size": 0 00:14:17.193 }, 00:14:17.193 { 00:14:17.193 "name": "BaseBdev2", 00:14:17.193 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:17.193 "is_configured": true, 00:14:17.193 "data_offset": 0, 00:14:17.193 "data_size": 65536 00:14:17.193 }, 00:14:17.193 { 00:14:17.193 "name": "BaseBdev3", 00:14:17.193 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:17.193 "is_configured": true, 00:14:17.193 "data_offset": 0, 00:14:17.193 "data_size": 65536 00:14:17.193 } 00:14:17.193 ] 00:14:17.193 }' 00:14:17.193 13:32:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.193 13:32:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.761 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:18.019 [2024-07-15 13:32:57.367828] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.020 13:32:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.020 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.278 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.278 "name": "Existed_Raid", 00:14:18.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.278 "strip_size_kb": 64, 00:14:18.278 "state": "configuring", 00:14:18.278 "raid_level": "concat", 00:14:18.278 "superblock": false, 00:14:18.278 "num_base_bdevs": 3, 00:14:18.278 "num_base_bdevs_discovered": 1, 00:14:18.278 "num_base_bdevs_operational": 3, 00:14:18.278 "base_bdevs_list": [ 00:14:18.278 { 00:14:18.278 "name": "BaseBdev1", 00:14:18.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.278 "is_configured": false, 00:14:18.278 "data_offset": 0, 00:14:18.278 "data_size": 0 00:14:18.278 }, 00:14:18.278 { 00:14:18.278 "name": null, 00:14:18.278 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:18.278 "is_configured": false, 00:14:18.279 "data_offset": 0, 00:14:18.279 "data_size": 65536 00:14:18.279 }, 00:14:18.279 { 00:14:18.279 "name": "BaseBdev3", 00:14:18.279 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:18.279 "is_configured": true, 00:14:18.279 "data_offset": 0, 
00:14:18.279 "data_size": 65536 00:14:18.279 } 00:14:18.279 ] 00:14:18.279 }' 00:14:18.279 13:32:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.279 13:32:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.845 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.845 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:19.105 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:19.105 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:19.364 [2024-07-15 13:32:58.728027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:19.364 BaseBdev1 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:19.364 13:32:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:19.623 13:32:58 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:20.192 [ 00:14:20.192 { 00:14:20.192 "name": "BaseBdev1", 00:14:20.192 "aliases": [ 00:14:20.192 "527d0450-a60a-4659-84b8-61cb39eb40f2" 00:14:20.192 ], 00:14:20.192 "product_name": "Malloc disk", 00:14:20.192 "block_size": 512, 00:14:20.192 "num_blocks": 65536, 00:14:20.192 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:20.192 "assigned_rate_limits": { 00:14:20.192 "rw_ios_per_sec": 0, 00:14:20.192 "rw_mbytes_per_sec": 0, 00:14:20.192 "r_mbytes_per_sec": 0, 00:14:20.192 "w_mbytes_per_sec": 0 00:14:20.192 }, 00:14:20.192 "claimed": true, 00:14:20.192 "claim_type": "exclusive_write", 00:14:20.192 "zoned": false, 00:14:20.192 "supported_io_types": { 00:14:20.192 "read": true, 00:14:20.192 "write": true, 00:14:20.192 "unmap": true, 00:14:20.192 "flush": true, 00:14:20.192 "reset": true, 00:14:20.192 "nvme_admin": false, 00:14:20.192 "nvme_io": false, 00:14:20.192 "nvme_io_md": false, 00:14:20.192 "write_zeroes": true, 00:14:20.192 "zcopy": true, 00:14:20.192 "get_zone_info": false, 00:14:20.192 "zone_management": false, 00:14:20.192 "zone_append": false, 00:14:20.192 "compare": false, 00:14:20.192 "compare_and_write": false, 00:14:20.192 "abort": true, 00:14:20.192 "seek_hole": false, 00:14:20.192 "seek_data": false, 00:14:20.192 "copy": true, 00:14:20.192 "nvme_iov_md": false 00:14:20.192 }, 00:14:20.192 "memory_domains": [ 00:14:20.192 { 00:14:20.192 "dma_device_id": "system", 00:14:20.192 "dma_device_type": 1 00:14:20.192 }, 00:14:20.192 { 00:14:20.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.192 "dma_device_type": 2 00:14:20.192 } 00:14:20.192 ], 00:14:20.192 "driver_specific": {} 00:14:20.192 } 00:14:20.192 ] 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:20.192 13:32:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.192 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.452 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.452 "name": "Existed_Raid", 00:14:20.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.452 "strip_size_kb": 64, 00:14:20.452 "state": "configuring", 00:14:20.452 "raid_level": "concat", 00:14:20.452 "superblock": false, 00:14:20.452 "num_base_bdevs": 3, 00:14:20.452 "num_base_bdevs_discovered": 2, 00:14:20.452 "num_base_bdevs_operational": 3, 00:14:20.452 "base_bdevs_list": [ 00:14:20.452 { 
00:14:20.452 "name": "BaseBdev1", 00:14:20.452 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:20.452 "is_configured": true, 00:14:20.452 "data_offset": 0, 00:14:20.452 "data_size": 65536 00:14:20.452 }, 00:14:20.452 { 00:14:20.452 "name": null, 00:14:20.452 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:20.452 "is_configured": false, 00:14:20.452 "data_offset": 0, 00:14:20.452 "data_size": 65536 00:14:20.452 }, 00:14:20.452 { 00:14:20.452 "name": "BaseBdev3", 00:14:20.452 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:20.452 "is_configured": true, 00:14:20.452 "data_offset": 0, 00:14:20.452 "data_size": 65536 00:14:20.452 } 00:14:20.452 ] 00:14:20.452 }' 00:14:20.452 13:32:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.452 13:32:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.020 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.020 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:21.279 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:21.279 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:21.539 [2024-07-15 13:33:00.777495] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.539 13:33:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.798 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.798 "name": "Existed_Raid", 00:14:21.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.798 "strip_size_kb": 64, 00:14:21.798 "state": "configuring", 00:14:21.798 "raid_level": "concat", 00:14:21.798 "superblock": false, 00:14:21.798 "num_base_bdevs": 3, 00:14:21.798 "num_base_bdevs_discovered": 1, 00:14:21.798 "num_base_bdevs_operational": 3, 00:14:21.798 "base_bdevs_list": [ 00:14:21.798 { 00:14:21.798 "name": "BaseBdev1", 00:14:21.798 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:21.798 "is_configured": true, 00:14:21.798 "data_offset": 0, 00:14:21.798 "data_size": 65536 00:14:21.798 }, 00:14:21.798 { 00:14:21.798 "name": null, 00:14:21.798 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:21.798 
"is_configured": false, 00:14:21.798 "data_offset": 0, 00:14:21.798 "data_size": 65536 00:14:21.798 }, 00:14:21.798 { 00:14:21.798 "name": null, 00:14:21.798 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:21.798 "is_configured": false, 00:14:21.798 "data_offset": 0, 00:14:21.798 "data_size": 65536 00:14:21.798 } 00:14:21.798 ] 00:14:21.798 }' 00:14:21.798 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.798 13:33:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.365 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.365 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:22.624 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:22.624 13:33:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:22.882 [2024-07-15 13:33:02.141130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.882 13:33:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.882 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.141 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.141 "name": "Existed_Raid", 00:14:23.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.141 "strip_size_kb": 64, 00:14:23.141 "state": "configuring", 00:14:23.141 "raid_level": "concat", 00:14:23.141 "superblock": false, 00:14:23.141 "num_base_bdevs": 3, 00:14:23.141 "num_base_bdevs_discovered": 2, 00:14:23.141 "num_base_bdevs_operational": 3, 00:14:23.141 "base_bdevs_list": [ 00:14:23.141 { 00:14:23.141 "name": "BaseBdev1", 00:14:23.141 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:23.141 "is_configured": true, 00:14:23.141 "data_offset": 0, 00:14:23.141 "data_size": 65536 00:14:23.141 }, 00:14:23.141 { 00:14:23.141 "name": null, 00:14:23.141 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:23.141 "is_configured": false, 00:14:23.141 "data_offset": 0, 00:14:23.141 "data_size": 65536 00:14:23.141 }, 00:14:23.141 { 00:14:23.141 "name": "BaseBdev3", 00:14:23.141 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:23.141 "is_configured": true, 00:14:23.141 "data_offset": 0, 
00:14:23.141 "data_size": 65536 00:14:23.141 } 00:14:23.141 ] 00:14:23.141 }' 00:14:23.141 13:33:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.141 13:33:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.706 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.706 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:23.974 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:23.974 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:24.240 [2024-07-15 13:33:03.488722] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.240 
13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.240 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.499 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.499 "name": "Existed_Raid", 00:14:24.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.499 "strip_size_kb": 64, 00:14:24.499 "state": "configuring", 00:14:24.499 "raid_level": "concat", 00:14:24.499 "superblock": false, 00:14:24.499 "num_base_bdevs": 3, 00:14:24.499 "num_base_bdevs_discovered": 1, 00:14:24.499 "num_base_bdevs_operational": 3, 00:14:24.499 "base_bdevs_list": [ 00:14:24.499 { 00:14:24.499 "name": null, 00:14:24.499 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:24.499 "is_configured": false, 00:14:24.499 "data_offset": 0, 00:14:24.499 "data_size": 65536 00:14:24.499 }, 00:14:24.499 { 00:14:24.499 "name": null, 00:14:24.499 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:24.499 "is_configured": false, 00:14:24.499 "data_offset": 0, 00:14:24.499 "data_size": 65536 00:14:24.499 }, 00:14:24.499 { 00:14:24.499 "name": "BaseBdev3", 00:14:24.499 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:24.499 "is_configured": true, 00:14:24.499 "data_offset": 0, 00:14:24.499 "data_size": 65536 00:14:24.499 } 00:14:24.499 ] 00:14:24.499 }' 00:14:24.499 13:33:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.499 13:33:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.065 13:33:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.065 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:25.324 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:25.324 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:25.583 [2024-07-15 13:33:04.820594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.583 13:33:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.842 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.843 "name": "Existed_Raid", 00:14:25.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.843 "strip_size_kb": 64, 00:14:25.843 "state": "configuring", 00:14:25.843 "raid_level": "concat", 00:14:25.843 "superblock": false, 00:14:25.843 "num_base_bdevs": 3, 00:14:25.843 "num_base_bdevs_discovered": 2, 00:14:25.843 "num_base_bdevs_operational": 3, 00:14:25.843 "base_bdevs_list": [ 00:14:25.843 { 00:14:25.843 "name": null, 00:14:25.843 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:25.843 "is_configured": false, 00:14:25.843 "data_offset": 0, 00:14:25.843 "data_size": 65536 00:14:25.843 }, 00:14:25.843 { 00:14:25.843 "name": "BaseBdev2", 00:14:25.843 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:25.843 "is_configured": true, 00:14:25.843 "data_offset": 0, 00:14:25.843 "data_size": 65536 00:14:25.843 }, 00:14:25.843 { 00:14:25.843 "name": "BaseBdev3", 00:14:25.843 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:25.843 "is_configured": true, 00:14:25.843 "data_offset": 0, 00:14:25.843 "data_size": 65536 00:14:25.843 } 00:14:25.843 ] 00:14:25.843 }' 00:14:25.843 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.843 13:33:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.440 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.440 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:26.699 
13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:26.699 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.699 13:33:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:26.958 13:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 527d0450-a60a-4659-84b8-61cb39eb40f2 00:14:27.527 [2024-07-15 13:33:06.690023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:27.527 [2024-07-15 13:33:06.690066] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x228c450 00:14:27.527 [2024-07-15 13:33:06.690075] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:27.527 [2024-07-15 13:33:06.690273] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x228ded0 00:14:27.527 [2024-07-15 13:33:06.690388] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228c450 00:14:27.527 [2024-07-15 13:33:06.690398] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x228c450 00:14:27.527 [2024-07-15 13:33:06.690571] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.527 NewBaseBdev 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:27.527 13:33:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:27.787 [ 00:14:27.787 { 00:14:27.787 "name": "NewBaseBdev", 00:14:27.787 "aliases": [ 00:14:27.787 "527d0450-a60a-4659-84b8-61cb39eb40f2" 00:14:27.787 ], 00:14:27.787 "product_name": "Malloc disk", 00:14:27.787 "block_size": 512, 00:14:27.787 "num_blocks": 65536, 00:14:27.787 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:27.787 "assigned_rate_limits": { 00:14:27.787 "rw_ios_per_sec": 0, 00:14:27.787 "rw_mbytes_per_sec": 0, 00:14:27.787 "r_mbytes_per_sec": 0, 00:14:27.787 "w_mbytes_per_sec": 0 00:14:27.787 }, 00:14:27.787 "claimed": true, 00:14:27.787 "claim_type": "exclusive_write", 00:14:27.787 "zoned": false, 00:14:27.787 "supported_io_types": { 00:14:27.787 "read": true, 00:14:27.787 "write": true, 00:14:27.787 "unmap": true, 00:14:27.787 "flush": true, 00:14:27.787 "reset": true, 00:14:27.787 "nvme_admin": false, 00:14:27.787 "nvme_io": false, 00:14:27.787 "nvme_io_md": false, 00:14:27.787 "write_zeroes": true, 00:14:27.787 "zcopy": true, 00:14:27.787 "get_zone_info": false, 00:14:27.787 "zone_management": false, 00:14:27.787 "zone_append": false, 00:14:27.787 "compare": false, 00:14:27.787 "compare_and_write": false, 00:14:27.787 "abort": true, 00:14:27.787 "seek_hole": false, 00:14:27.787 "seek_data": false, 00:14:27.787 "copy": true, 00:14:27.787 "nvme_iov_md": 
false 00:14:27.787 }, 00:14:27.787 "memory_domains": [ 00:14:27.787 { 00:14:27.787 "dma_device_id": "system", 00:14:27.787 "dma_device_type": 1 00:14:27.787 }, 00:14:27.787 { 00:14:27.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.787 "dma_device_type": 2 00:14:27.787 } 00:14:27.787 ], 00:14:27.787 "driver_specific": {} 00:14:27.787 } 00:14:27.787 ] 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.787 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.047 13:33:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.047 "name": "Existed_Raid", 00:14:28.047 "uuid": "8d517ccc-6b35-47c5-bab7-32687cd97e80", 00:14:28.047 "strip_size_kb": 64, 00:14:28.047 "state": "online", 00:14:28.047 "raid_level": "concat", 00:14:28.047 "superblock": false, 00:14:28.047 "num_base_bdevs": 3, 00:14:28.047 "num_base_bdevs_discovered": 3, 00:14:28.047 "num_base_bdevs_operational": 3, 00:14:28.047 "base_bdevs_list": [ 00:14:28.047 { 00:14:28.047 "name": "NewBaseBdev", 00:14:28.047 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:28.047 "is_configured": true, 00:14:28.047 "data_offset": 0, 00:14:28.047 "data_size": 65536 00:14:28.047 }, 00:14:28.047 { 00:14:28.047 "name": "BaseBdev2", 00:14:28.047 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:28.047 "is_configured": true, 00:14:28.047 "data_offset": 0, 00:14:28.047 "data_size": 65536 00:14:28.047 }, 00:14:28.047 { 00:14:28.047 "name": "BaseBdev3", 00:14:28.047 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:28.047 "is_configured": true, 00:14:28.047 "data_offset": 0, 00:14:28.047 "data_size": 65536 00:14:28.047 } 00:14:28.047 ] 00:14:28.047 }' 00:14:28.047 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.047 13:33:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.615 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:28.616 13:33:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:28.616 13:33:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:28.875 [2024-07-15 13:33:08.094045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:28.875 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:28.875 "name": "Existed_Raid", 00:14:28.875 "aliases": [ 00:14:28.875 "8d517ccc-6b35-47c5-bab7-32687cd97e80" 00:14:28.875 ], 00:14:28.875 "product_name": "Raid Volume", 00:14:28.875 "block_size": 512, 00:14:28.875 "num_blocks": 196608, 00:14:28.875 "uuid": "8d517ccc-6b35-47c5-bab7-32687cd97e80", 00:14:28.875 "assigned_rate_limits": { 00:14:28.875 "rw_ios_per_sec": 0, 00:14:28.875 "rw_mbytes_per_sec": 0, 00:14:28.875 "r_mbytes_per_sec": 0, 00:14:28.875 "w_mbytes_per_sec": 0 00:14:28.875 }, 00:14:28.875 "claimed": false, 00:14:28.875 "zoned": false, 00:14:28.875 "supported_io_types": { 00:14:28.875 "read": true, 00:14:28.875 "write": true, 00:14:28.875 "unmap": true, 00:14:28.875 "flush": true, 00:14:28.875 "reset": true, 00:14:28.875 "nvme_admin": false, 00:14:28.875 "nvme_io": false, 00:14:28.875 "nvme_io_md": false, 00:14:28.875 "write_zeroes": true, 00:14:28.875 "zcopy": false, 00:14:28.875 "get_zone_info": false, 00:14:28.875 "zone_management": false, 00:14:28.875 "zone_append": false, 00:14:28.875 "compare": false, 00:14:28.875 "compare_and_write": false, 00:14:28.875 "abort": false, 00:14:28.875 "seek_hole": false, 00:14:28.875 "seek_data": false, 00:14:28.875 "copy": false, 00:14:28.875 "nvme_iov_md": false 00:14:28.875 }, 00:14:28.875 "memory_domains": [ 00:14:28.875 { 00:14:28.875 "dma_device_id": "system", 00:14:28.875 "dma_device_type": 1 00:14:28.875 }, 
00:14:28.875 { 00:14:28.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.875 "dma_device_type": 2 00:14:28.875 }, 00:14:28.875 { 00:14:28.875 "dma_device_id": "system", 00:14:28.875 "dma_device_type": 1 00:14:28.875 }, 00:14:28.875 { 00:14:28.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.875 "dma_device_type": 2 00:14:28.875 }, 00:14:28.875 { 00:14:28.875 "dma_device_id": "system", 00:14:28.875 "dma_device_type": 1 00:14:28.875 }, 00:14:28.875 { 00:14:28.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.875 "dma_device_type": 2 00:14:28.875 } 00:14:28.875 ], 00:14:28.875 "driver_specific": { 00:14:28.875 "raid": { 00:14:28.875 "uuid": "8d517ccc-6b35-47c5-bab7-32687cd97e80", 00:14:28.875 "strip_size_kb": 64, 00:14:28.875 "state": "online", 00:14:28.876 "raid_level": "concat", 00:14:28.876 "superblock": false, 00:14:28.876 "num_base_bdevs": 3, 00:14:28.876 "num_base_bdevs_discovered": 3, 00:14:28.876 "num_base_bdevs_operational": 3, 00:14:28.876 "base_bdevs_list": [ 00:14:28.876 { 00:14:28.876 "name": "NewBaseBdev", 00:14:28.876 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:28.876 "is_configured": true, 00:14:28.876 "data_offset": 0, 00:14:28.876 "data_size": 65536 00:14:28.876 }, 00:14:28.876 { 00:14:28.876 "name": "BaseBdev2", 00:14:28.876 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:28.876 "is_configured": true, 00:14:28.876 "data_offset": 0, 00:14:28.876 "data_size": 65536 00:14:28.876 }, 00:14:28.876 { 00:14:28.876 "name": "BaseBdev3", 00:14:28.876 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:28.876 "is_configured": true, 00:14:28.876 "data_offset": 0, 00:14:28.876 "data_size": 65536 00:14:28.876 } 00:14:28.876 ] 00:14:28.876 } 00:14:28.876 } 00:14:28.876 }' 00:14:28.876 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:28.876 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:28.876 BaseBdev2 00:14:28.876 BaseBdev3' 00:14:28.876 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:28.876 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:28.876 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:29.135 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:29.135 "name": "NewBaseBdev", 00:14:29.135 "aliases": [ 00:14:29.135 "527d0450-a60a-4659-84b8-61cb39eb40f2" 00:14:29.135 ], 00:14:29.135 "product_name": "Malloc disk", 00:14:29.135 "block_size": 512, 00:14:29.135 "num_blocks": 65536, 00:14:29.135 "uuid": "527d0450-a60a-4659-84b8-61cb39eb40f2", 00:14:29.135 "assigned_rate_limits": { 00:14:29.135 "rw_ios_per_sec": 0, 00:14:29.135 "rw_mbytes_per_sec": 0, 00:14:29.135 "r_mbytes_per_sec": 0, 00:14:29.135 "w_mbytes_per_sec": 0 00:14:29.135 }, 00:14:29.135 "claimed": true, 00:14:29.135 "claim_type": "exclusive_write", 00:14:29.135 "zoned": false, 00:14:29.135 "supported_io_types": { 00:14:29.135 "read": true, 00:14:29.135 "write": true, 00:14:29.135 "unmap": true, 00:14:29.135 "flush": true, 00:14:29.135 "reset": true, 00:14:29.135 "nvme_admin": false, 00:14:29.135 "nvme_io": false, 00:14:29.135 "nvme_io_md": false, 00:14:29.135 "write_zeroes": true, 00:14:29.135 "zcopy": true, 00:14:29.135 "get_zone_info": false, 00:14:29.135 "zone_management": false, 00:14:29.135 "zone_append": false, 00:14:29.135 "compare": false, 00:14:29.135 "compare_and_write": false, 00:14:29.135 "abort": true, 00:14:29.135 "seek_hole": false, 00:14:29.135 "seek_data": false, 00:14:29.135 "copy": true, 00:14:29.135 "nvme_iov_md": false 00:14:29.135 }, 00:14:29.135 "memory_domains": [ 00:14:29.135 { 00:14:29.135 "dma_device_id": "system", 00:14:29.135 
"dma_device_type": 1 00:14:29.135 }, 00:14:29.135 { 00:14:29.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.135 "dma_device_type": 2 00:14:29.135 } 00:14:29.135 ], 00:14:29.135 "driver_specific": {} 00:14:29.135 }' 00:14:29.135 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.136 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:29.395 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:29.654 "name": 
"BaseBdev2", 00:14:29.654 "aliases": [ 00:14:29.654 "e36674d3-fb10-4a52-b5fe-36813433d223" 00:14:29.654 ], 00:14:29.654 "product_name": "Malloc disk", 00:14:29.654 "block_size": 512, 00:14:29.654 "num_blocks": 65536, 00:14:29.654 "uuid": "e36674d3-fb10-4a52-b5fe-36813433d223", 00:14:29.654 "assigned_rate_limits": { 00:14:29.654 "rw_ios_per_sec": 0, 00:14:29.654 "rw_mbytes_per_sec": 0, 00:14:29.654 "r_mbytes_per_sec": 0, 00:14:29.654 "w_mbytes_per_sec": 0 00:14:29.654 }, 00:14:29.654 "claimed": true, 00:14:29.654 "claim_type": "exclusive_write", 00:14:29.654 "zoned": false, 00:14:29.654 "supported_io_types": { 00:14:29.654 "read": true, 00:14:29.654 "write": true, 00:14:29.654 "unmap": true, 00:14:29.654 "flush": true, 00:14:29.654 "reset": true, 00:14:29.654 "nvme_admin": false, 00:14:29.654 "nvme_io": false, 00:14:29.654 "nvme_io_md": false, 00:14:29.654 "write_zeroes": true, 00:14:29.654 "zcopy": true, 00:14:29.654 "get_zone_info": false, 00:14:29.654 "zone_management": false, 00:14:29.654 "zone_append": false, 00:14:29.654 "compare": false, 00:14:29.654 "compare_and_write": false, 00:14:29.654 "abort": true, 00:14:29.654 "seek_hole": false, 00:14:29.654 "seek_data": false, 00:14:29.654 "copy": true, 00:14:29.654 "nvme_iov_md": false 00:14:29.654 }, 00:14:29.654 "memory_domains": [ 00:14:29.654 { 00:14:29.654 "dma_device_id": "system", 00:14:29.654 "dma_device_type": 1 00:14:29.654 }, 00:14:29.654 { 00:14:29.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.654 "dma_device_type": 2 00:14:29.654 } 00:14:29.654 ], 00:14:29.654 "driver_specific": {} 00:14:29.654 }' 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:29.654 13:33:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.654 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:29.654 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.654 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:29.913 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:30.173 "name": "BaseBdev3", 00:14:30.173 "aliases": [ 00:14:30.173 "1e9fa3b2-3da9-48ce-a035-a074ff48c62a" 00:14:30.173 ], 00:14:30.173 "product_name": "Malloc disk", 00:14:30.173 "block_size": 512, 00:14:30.173 "num_blocks": 65536, 00:14:30.173 "uuid": "1e9fa3b2-3da9-48ce-a035-a074ff48c62a", 00:14:30.173 "assigned_rate_limits": { 00:14:30.173 "rw_ios_per_sec": 0, 00:14:30.173 "rw_mbytes_per_sec": 0, 00:14:30.173 "r_mbytes_per_sec": 0, 00:14:30.173 "w_mbytes_per_sec": 0 00:14:30.173 }, 00:14:30.173 "claimed": true, 00:14:30.173 "claim_type": "exclusive_write", 00:14:30.173 "zoned": false, 00:14:30.173 "supported_io_types": { 
00:14:30.173 "read": true, 00:14:30.173 "write": true, 00:14:30.173 "unmap": true, 00:14:30.173 "flush": true, 00:14:30.173 "reset": true, 00:14:30.173 "nvme_admin": false, 00:14:30.173 "nvme_io": false, 00:14:30.173 "nvme_io_md": false, 00:14:30.173 "write_zeroes": true, 00:14:30.173 "zcopy": true, 00:14:30.173 "get_zone_info": false, 00:14:30.173 "zone_management": false, 00:14:30.173 "zone_append": false, 00:14:30.173 "compare": false, 00:14:30.173 "compare_and_write": false, 00:14:30.173 "abort": true, 00:14:30.173 "seek_hole": false, 00:14:30.173 "seek_data": false, 00:14:30.173 "copy": true, 00:14:30.173 "nvme_iov_md": false 00:14:30.173 }, 00:14:30.173 "memory_domains": [ 00:14:30.173 { 00:14:30.173 "dma_device_id": "system", 00:14:30.173 "dma_device_type": 1 00:14:30.173 }, 00:14:30.173 { 00:14:30.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.173 "dma_device_type": 2 00:14:30.173 } 00:14:30.173 ], 00:14:30.173 "driver_specific": {} 00:14:30.173 }' 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.173 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:30.433 13:33:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:30.692 [2024-07-15 13:33:10.010859] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:30.692 [2024-07-15 13:33:10.010887] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:30.692 [2024-07-15 13:33:10.010954] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:30.692 [2024-07-15 13:33:10.011006] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:30.692 [2024-07-15 13:33:10.011018] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228c450 name Existed_Raid, state offline 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2100203 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2100203 ']' 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2100203 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2100203 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2100203' 00:14:30.692 killing process with pid 2100203 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2100203 00:14:30.692 [2024-07-15 13:33:10.098718] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:30.692 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2100203 00:14:30.952 [2024-07-15 13:33:10.125403] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:30.952 13:33:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:30.952 00:14:30.952 real 0m29.318s 00:14:30.952 user 0m53.640s 00:14:30.952 sys 0m5.289s 00:14:30.952 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:30.952 13:33:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.952 ************************************ 00:14:30.952 END TEST raid_state_function_test 00:14:30.952 ************************************ 00:14:31.212 13:33:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:31.212 13:33:10 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:14:31.212 13:33:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:31.212 13:33:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:31.212 13:33:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:31.212 ************************************ 00:14:31.212 START TEST raid_state_function_test_sb 00:14:31.212 ************************************ 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2105177 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2105177' 00:14:31.212 Process raid pid: 2105177 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2105177 /var/tmp/spdk-raid.sock 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2105177 ']' 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:14:31.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:31.212 13:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.212 [2024-07-15 13:33:10.501467] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:14:31.213 [2024-07-15 13:33:10.501539] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:31.213 [2024-07-15 13:33:10.621418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:31.472 [2024-07-15 13:33:10.723378] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:31.472 [2024-07-15 13:33:10.791148] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:31.472 [2024-07-15 13:33:10.791186] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.039 13:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:32.039 13:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:32.039 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:32.299 [2024-07-15 13:33:11.574269] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:32.299 [2024-07-15 13:33:11.574312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:32.299 [2024-07-15 13:33:11.574328] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:32.299 [2024-07-15 13:33:11.574341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:32.299 [2024-07-15 13:33:11.574350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:32.299 [2024-07-15 13:33:11.574363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.299 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:32.557 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.557 "name": "Existed_Raid", 00:14:32.557 "uuid": "b5efef03-7ae7-47b0-95d4-24b9d5eb8fd1", 00:14:32.557 "strip_size_kb": 64, 00:14:32.557 "state": "configuring", 00:14:32.557 "raid_level": "concat", 00:14:32.557 "superblock": true, 00:14:32.557 "num_base_bdevs": 3, 00:14:32.557 "num_base_bdevs_discovered": 0, 00:14:32.557 "num_base_bdevs_operational": 3, 00:14:32.557 "base_bdevs_list": [ 00:14:32.557 { 00:14:32.557 "name": "BaseBdev1", 00:14:32.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.557 "is_configured": false, 00:14:32.557 "data_offset": 0, 00:14:32.557 "data_size": 0 00:14:32.557 }, 00:14:32.557 { 00:14:32.557 "name": "BaseBdev2", 00:14:32.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.557 "is_configured": false, 00:14:32.557 "data_offset": 0, 00:14:32.557 "data_size": 0 00:14:32.557 }, 00:14:32.557 { 00:14:32.557 "name": "BaseBdev3", 00:14:32.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.557 "is_configured": false, 00:14:32.557 "data_offset": 0, 00:14:32.557 "data_size": 0 00:14:32.557 } 00:14:32.557 ] 00:14:32.557 }' 00:14:32.557 13:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.557 13:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.123 13:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:33.380 [2024-07-15 13:33:12.656978] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:33.380 [2024-07-15 13:33:12.657009] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dca80 name Existed_Raid, state configuring 00:14:33.380 13:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:33.638 [2024-07-15 13:33:12.901654] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:33.638 [2024-07-15 13:33:12.901685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:33.638 [2024-07-15 13:33:12.901694] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:33.638 [2024-07-15 13:33:12.901706] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:33.638 [2024-07-15 13:33:12.901723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:33.638 [2024-07-15 13:33:12.901734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:33.638 13:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:33.894 [2024-07-15 13:33:13.087950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:33.894 BaseBdev1 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:33.894 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:34.152 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:34.409 [ 00:14:34.409 { 00:14:34.409 "name": "BaseBdev1", 00:14:34.409 "aliases": [ 00:14:34.409 "36abb487-872e-4a33-b1af-6e0fc9bf4c11" 00:14:34.409 ], 00:14:34.409 "product_name": "Malloc disk", 00:14:34.409 "block_size": 512, 00:14:34.409 "num_blocks": 65536, 00:14:34.409 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:34.409 "assigned_rate_limits": { 00:14:34.409 "rw_ios_per_sec": 0, 00:14:34.409 "rw_mbytes_per_sec": 0, 00:14:34.409 "r_mbytes_per_sec": 0, 00:14:34.409 "w_mbytes_per_sec": 0 00:14:34.409 }, 00:14:34.409 "claimed": true, 00:14:34.409 "claim_type": "exclusive_write", 00:14:34.409 "zoned": false, 00:14:34.409 "supported_io_types": { 00:14:34.409 "read": true, 00:14:34.409 "write": true, 00:14:34.409 "unmap": true, 00:14:34.409 "flush": true, 00:14:34.409 "reset": true, 00:14:34.409 "nvme_admin": false, 00:14:34.409 "nvme_io": false, 00:14:34.409 "nvme_io_md": false, 00:14:34.409 "write_zeroes": true, 00:14:34.409 "zcopy": true, 00:14:34.409 "get_zone_info": false, 00:14:34.409 "zone_management": false, 00:14:34.409 "zone_append": false, 00:14:34.409 "compare": false, 00:14:34.410 "compare_and_write": false, 00:14:34.410 "abort": true, 00:14:34.410 "seek_hole": false, 00:14:34.410 "seek_data": false, 00:14:34.410 "copy": true, 00:14:34.410 "nvme_iov_md": false 00:14:34.410 }, 00:14:34.410 "memory_domains": [ 00:14:34.410 { 00:14:34.410 "dma_device_id": "system", 00:14:34.410 "dma_device_type": 1 00:14:34.410 }, 00:14:34.410 { 00:14:34.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.410 
"dma_device_type": 2 00:14:34.410 } 00:14:34.410 ], 00:14:34.410 "driver_specific": {} 00:14:34.410 } 00:14:34.410 ] 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.410 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.668 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.668 "name": "Existed_Raid", 00:14:34.668 "uuid": "32b28ab1-e8fa-4d78-ab31-0f244338dd11", 00:14:34.668 "strip_size_kb": 64, 
00:14:34.668 "state": "configuring", 00:14:34.668 "raid_level": "concat", 00:14:34.668 "superblock": true, 00:14:34.668 "num_base_bdevs": 3, 00:14:34.668 "num_base_bdevs_discovered": 1, 00:14:34.668 "num_base_bdevs_operational": 3, 00:14:34.668 "base_bdevs_list": [ 00:14:34.668 { 00:14:34.668 "name": "BaseBdev1", 00:14:34.668 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:34.668 "is_configured": true, 00:14:34.668 "data_offset": 2048, 00:14:34.668 "data_size": 63488 00:14:34.668 }, 00:14:34.668 { 00:14:34.668 "name": "BaseBdev2", 00:14:34.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.668 "is_configured": false, 00:14:34.668 "data_offset": 0, 00:14:34.668 "data_size": 0 00:14:34.668 }, 00:14:34.668 { 00:14:34.668 "name": "BaseBdev3", 00:14:34.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.668 "is_configured": false, 00:14:34.668 "data_offset": 0, 00:14:34.668 "data_size": 0 00:14:34.668 } 00:14:34.668 ] 00:14:34.668 }' 00:14:34.668 13:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.668 13:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.233 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:35.233 [2024-07-15 13:33:14.644063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:35.233 [2024-07-15 13:33:14.644104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dc310 name Existed_Raid, state configuring 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:35.492 [2024-07-15 13:33:14.884746] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:35.492 [2024-07-15 13:33:14.886245] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:35.492 [2024-07-15 13:33:14.886278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:35.492 [2024-07-15 13:33:14.886288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:35.492 [2024-07-15 13:33:14.886300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.492 13:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.750 13:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.750 "name": "Existed_Raid", 00:14:35.750 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:35.750 "strip_size_kb": 64, 00:14:35.750 "state": "configuring", 00:14:35.750 "raid_level": "concat", 00:14:35.750 "superblock": true, 00:14:35.750 "num_base_bdevs": 3, 00:14:35.750 "num_base_bdevs_discovered": 1, 00:14:35.750 "num_base_bdevs_operational": 3, 00:14:35.750 "base_bdevs_list": [ 00:14:35.750 { 00:14:35.750 "name": "BaseBdev1", 00:14:35.750 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:35.750 "is_configured": true, 00:14:35.750 "data_offset": 2048, 00:14:35.750 "data_size": 63488 00:14:35.750 }, 00:14:35.750 { 00:14:35.750 "name": "BaseBdev2", 00:14:35.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.750 "is_configured": false, 00:14:35.750 "data_offset": 0, 00:14:35.750 "data_size": 0 00:14:35.750 }, 00:14:35.750 { 00:14:35.750 "name": "BaseBdev3", 00:14:35.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.750 "is_configured": false, 00:14:35.750 "data_offset": 0, 00:14:35.751 "data_size": 0 00:14:35.751 } 00:14:35.751 ] 00:14:35.751 }' 00:14:35.751 13:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.751 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:36.318 13:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:14:36.578 [2024-07-15 13:33:15.963062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:36.578 BaseBdev2 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:36.578 13:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.837 13:33:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:37.096 [ 00:14:37.096 { 00:14:37.096 "name": "BaseBdev2", 00:14:37.096 "aliases": [ 00:14:37.096 "09121ef5-c4e4-46b2-983b-6a7650e1556c" 00:14:37.096 ], 00:14:37.096 "product_name": "Malloc disk", 00:14:37.096 "block_size": 512, 00:14:37.096 "num_blocks": 65536, 00:14:37.096 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:37.096 "assigned_rate_limits": { 00:14:37.096 "rw_ios_per_sec": 0, 00:14:37.096 "rw_mbytes_per_sec": 0, 00:14:37.096 "r_mbytes_per_sec": 0, 00:14:37.096 "w_mbytes_per_sec": 0 00:14:37.096 }, 00:14:37.096 "claimed": true, 00:14:37.096 "claim_type": "exclusive_write", 00:14:37.096 "zoned": false, 00:14:37.096 "supported_io_types": { 00:14:37.096 "read": true, 00:14:37.096 "write": true, 
00:14:37.096 "unmap": true, 00:14:37.097 "flush": true, 00:14:37.097 "reset": true, 00:14:37.097 "nvme_admin": false, 00:14:37.097 "nvme_io": false, 00:14:37.097 "nvme_io_md": false, 00:14:37.097 "write_zeroes": true, 00:14:37.097 "zcopy": true, 00:14:37.097 "get_zone_info": false, 00:14:37.097 "zone_management": false, 00:14:37.097 "zone_append": false, 00:14:37.097 "compare": false, 00:14:37.097 "compare_and_write": false, 00:14:37.097 "abort": true, 00:14:37.097 "seek_hole": false, 00:14:37.097 "seek_data": false, 00:14:37.097 "copy": true, 00:14:37.097 "nvme_iov_md": false 00:14:37.097 }, 00:14:37.097 "memory_domains": [ 00:14:37.097 { 00:14:37.097 "dma_device_id": "system", 00:14:37.097 "dma_device_type": 1 00:14:37.097 }, 00:14:37.097 { 00:14:37.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.097 "dma_device_type": 2 00:14:37.097 } 00:14:37.097 ], 00:14:37.097 "driver_specific": {} 00:14:37.097 } 00:14:37.097 ] 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.097 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.356 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.356 "name": "Existed_Raid", 00:14:37.356 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:37.356 "strip_size_kb": 64, 00:14:37.356 "state": "configuring", 00:14:37.356 "raid_level": "concat", 00:14:37.356 "superblock": true, 00:14:37.356 "num_base_bdevs": 3, 00:14:37.356 "num_base_bdevs_discovered": 2, 00:14:37.356 "num_base_bdevs_operational": 3, 00:14:37.356 "base_bdevs_list": [ 00:14:37.356 { 00:14:37.356 "name": "BaseBdev1", 00:14:37.356 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:37.356 "is_configured": true, 00:14:37.356 "data_offset": 2048, 00:14:37.356 "data_size": 63488 00:14:37.356 }, 00:14:37.356 { 00:14:37.356 "name": "BaseBdev2", 00:14:37.356 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:37.356 "is_configured": true, 00:14:37.356 "data_offset": 2048, 00:14:37.356 "data_size": 63488 00:14:37.356 }, 00:14:37.356 { 00:14:37.356 "name": "BaseBdev3", 00:14:37.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.356 "is_configured": false, 00:14:37.356 "data_offset": 0, 00:14:37.356 "data_size": 0 00:14:37.356 } 
00:14:37.356 ] 00:14:37.356 }' 00:14:37.356 13:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.356 13:33:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.924 13:33:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:38.183 [2024-07-15 13:33:17.570775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:38.183 [2024-07-15 13:33:17.570947] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17dd400 00:14:38.183 [2024-07-15 13:33:17.570961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:38.183 [2024-07-15 13:33:17.571131] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17dcef0 00:14:38.183 [2024-07-15 13:33:17.571244] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17dd400 00:14:38.183 [2024-07-15 13:33:17.571254] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17dd400 00:14:38.183 [2024-07-15 13:33:17.571342] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:38.183 BaseBdev3 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.183 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.442 13:33:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:38.700 [ 00:14:38.700 { 00:14:38.700 "name": "BaseBdev3", 00:14:38.700 "aliases": [ 00:14:38.700 "62f433c4-d81e-45ac-b88f-2e1b08986c6b" 00:14:38.700 ], 00:14:38.700 "product_name": "Malloc disk", 00:14:38.700 "block_size": 512, 00:14:38.700 "num_blocks": 65536, 00:14:38.700 "uuid": "62f433c4-d81e-45ac-b88f-2e1b08986c6b", 00:14:38.700 "assigned_rate_limits": { 00:14:38.700 "rw_ios_per_sec": 0, 00:14:38.700 "rw_mbytes_per_sec": 0, 00:14:38.700 "r_mbytes_per_sec": 0, 00:14:38.700 "w_mbytes_per_sec": 0 00:14:38.700 }, 00:14:38.700 "claimed": true, 00:14:38.700 "claim_type": "exclusive_write", 00:14:38.700 "zoned": false, 00:14:38.700 "supported_io_types": { 00:14:38.700 "read": true, 00:14:38.700 "write": true, 00:14:38.700 "unmap": true, 00:14:38.700 "flush": true, 00:14:38.700 "reset": true, 00:14:38.700 "nvme_admin": false, 00:14:38.700 "nvme_io": false, 00:14:38.700 "nvme_io_md": false, 00:14:38.700 "write_zeroes": true, 00:14:38.700 "zcopy": true, 00:14:38.700 "get_zone_info": false, 00:14:38.700 "zone_management": false, 00:14:38.700 "zone_append": false, 00:14:38.700 "compare": false, 00:14:38.700 "compare_and_write": false, 00:14:38.700 "abort": true, 00:14:38.700 "seek_hole": false, 00:14:38.700 "seek_data": false, 00:14:38.700 "copy": true, 00:14:38.700 "nvme_iov_md": false 00:14:38.700 }, 00:14:38.700 "memory_domains": [ 00:14:38.700 { 00:14:38.700 "dma_device_id": "system", 00:14:38.700 "dma_device_type": 1 00:14:38.700 }, 00:14:38.700 { 00:14:38.700 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:38.700 "dma_device_type": 2 00:14:38.700 } 00:14:38.700 ], 00:14:38.700 "driver_specific": {} 00:14:38.700 } 00:14:38.700 ] 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:38.700 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:38.701 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:38.701 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.701 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:38.959 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.959 "name": "Existed_Raid", 00:14:38.959 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:38.959 "strip_size_kb": 64, 00:14:38.959 "state": "online", 00:14:38.959 "raid_level": "concat", 00:14:38.959 "superblock": true, 00:14:38.959 "num_base_bdevs": 3, 00:14:38.959 "num_base_bdevs_discovered": 3, 00:14:38.959 "num_base_bdevs_operational": 3, 00:14:38.959 "base_bdevs_list": [ 00:14:38.959 { 00:14:38.959 "name": "BaseBdev1", 00:14:38.959 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:38.959 "is_configured": true, 00:14:38.959 "data_offset": 2048, 00:14:38.959 "data_size": 63488 00:14:38.959 }, 00:14:38.959 { 00:14:38.959 "name": "BaseBdev2", 00:14:38.959 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:38.959 "is_configured": true, 00:14:38.960 "data_offset": 2048, 00:14:38.960 "data_size": 63488 00:14:38.960 }, 00:14:38.960 { 00:14:38.960 "name": "BaseBdev3", 00:14:38.960 "uuid": "62f433c4-d81e-45ac-b88f-2e1b08986c6b", 00:14:38.960 "is_configured": true, 00:14:38.960 "data_offset": 2048, 00:14:38.960 "data_size": 63488 00:14:38.960 } 00:14:38.960 ] 00:14:38.960 }' 00:14:38.960 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.960 13:33:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:39.528 13:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:39.787 [2024-07-15 13:33:19.139240] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:39.787 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:39.787 "name": "Existed_Raid", 00:14:39.787 "aliases": [ 00:14:39.787 "1dc65161-e56d-4f88-a896-18477171c349" 00:14:39.787 ], 00:14:39.787 "product_name": "Raid Volume", 00:14:39.787 "block_size": 512, 00:14:39.787 "num_blocks": 190464, 00:14:39.787 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:39.787 "assigned_rate_limits": { 00:14:39.787 "rw_ios_per_sec": 0, 00:14:39.787 "rw_mbytes_per_sec": 0, 00:14:39.787 "r_mbytes_per_sec": 0, 00:14:39.787 "w_mbytes_per_sec": 0 00:14:39.787 }, 00:14:39.787 "claimed": false, 00:14:39.787 "zoned": false, 00:14:39.787 "supported_io_types": { 00:14:39.787 "read": true, 00:14:39.787 "write": true, 00:14:39.787 "unmap": true, 00:14:39.787 "flush": true, 00:14:39.787 "reset": true, 00:14:39.787 "nvme_admin": false, 00:14:39.787 "nvme_io": false, 00:14:39.787 "nvme_io_md": false, 00:14:39.787 "write_zeroes": true, 00:14:39.787 "zcopy": false, 00:14:39.787 "get_zone_info": false, 00:14:39.787 "zone_management": false, 00:14:39.787 "zone_append": false, 00:14:39.787 "compare": false, 00:14:39.787 "compare_and_write": false, 00:14:39.787 "abort": false, 00:14:39.787 "seek_hole": false, 00:14:39.787 "seek_data": false, 00:14:39.787 "copy": false, 00:14:39.787 "nvme_iov_md": false 00:14:39.787 }, 00:14:39.787 "memory_domains": [ 00:14:39.787 { 00:14:39.787 "dma_device_id": "system", 
00:14:39.787 "dma_device_type": 1 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.787 "dma_device_type": 2 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "dma_device_id": "system", 00:14:39.787 "dma_device_type": 1 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.787 "dma_device_type": 2 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "dma_device_id": "system", 00:14:39.787 "dma_device_type": 1 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.787 "dma_device_type": 2 00:14:39.787 } 00:14:39.787 ], 00:14:39.787 "driver_specific": { 00:14:39.787 "raid": { 00:14:39.787 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:39.787 "strip_size_kb": 64, 00:14:39.787 "state": "online", 00:14:39.787 "raid_level": "concat", 00:14:39.787 "superblock": true, 00:14:39.787 "num_base_bdevs": 3, 00:14:39.787 "num_base_bdevs_discovered": 3, 00:14:39.787 "num_base_bdevs_operational": 3, 00:14:39.787 "base_bdevs_list": [ 00:14:39.787 { 00:14:39.787 "name": "BaseBdev1", 00:14:39.787 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:39.787 "is_configured": true, 00:14:39.787 "data_offset": 2048, 00:14:39.787 "data_size": 63488 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "name": "BaseBdev2", 00:14:39.787 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:39.787 "is_configured": true, 00:14:39.787 "data_offset": 2048, 00:14:39.787 "data_size": 63488 00:14:39.787 }, 00:14:39.787 { 00:14:39.787 "name": "BaseBdev3", 00:14:39.787 "uuid": "62f433c4-d81e-45ac-b88f-2e1b08986c6b", 00:14:39.787 "is_configured": true, 00:14:39.787 "data_offset": 2048, 00:14:39.787 "data_size": 63488 00:14:39.787 } 00:14:39.787 ] 00:14:39.787 } 00:14:39.787 } 00:14:39.787 }' 00:14:39.787 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:39.787 13:33:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:39.787 BaseBdev2 00:14:39.787 BaseBdev3' 00:14:39.787 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:39.787 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:39.787 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.048 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.048 "name": "BaseBdev1", 00:14:40.048 "aliases": [ 00:14:40.048 "36abb487-872e-4a33-b1af-6e0fc9bf4c11" 00:14:40.048 ], 00:14:40.048 "product_name": "Malloc disk", 00:14:40.048 "block_size": 512, 00:14:40.048 "num_blocks": 65536, 00:14:40.048 "uuid": "36abb487-872e-4a33-b1af-6e0fc9bf4c11", 00:14:40.048 "assigned_rate_limits": { 00:14:40.048 "rw_ios_per_sec": 0, 00:14:40.048 "rw_mbytes_per_sec": 0, 00:14:40.048 "r_mbytes_per_sec": 0, 00:14:40.048 "w_mbytes_per_sec": 0 00:14:40.048 }, 00:14:40.048 "claimed": true, 00:14:40.048 "claim_type": "exclusive_write", 00:14:40.048 "zoned": false, 00:14:40.048 "supported_io_types": { 00:14:40.048 "read": true, 00:14:40.048 "write": true, 00:14:40.048 "unmap": true, 00:14:40.048 "flush": true, 00:14:40.048 "reset": true, 00:14:40.048 "nvme_admin": false, 00:14:40.048 "nvme_io": false, 00:14:40.048 "nvme_io_md": false, 00:14:40.048 "write_zeroes": true, 00:14:40.048 "zcopy": true, 00:14:40.048 "get_zone_info": false, 00:14:40.048 "zone_management": false, 00:14:40.048 "zone_append": false, 00:14:40.048 "compare": false, 00:14:40.048 "compare_and_write": false, 00:14:40.048 "abort": true, 00:14:40.048 "seek_hole": false, 00:14:40.048 "seek_data": false, 00:14:40.048 "copy": true, 00:14:40.048 "nvme_iov_md": false 00:14:40.048 }, 00:14:40.048 "memory_domains": 
[ 00:14:40.048 { 00:14:40.048 "dma_device_id": "system", 00:14:40.048 "dma_device_type": 1 00:14:40.048 }, 00:14:40.048 { 00:14:40.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.048 "dma_device_type": 2 00:14:40.048 } 00:14:40.048 ], 00:14:40.048 "driver_specific": {} 00:14:40.048 }' 00:14:40.048 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.347 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.606 "name": "BaseBdev2", 00:14:40.606 "aliases": [ 00:14:40.606 "09121ef5-c4e4-46b2-983b-6a7650e1556c" 00:14:40.606 ], 00:14:40.606 "product_name": "Malloc disk", 00:14:40.606 "block_size": 512, 00:14:40.606 "num_blocks": 65536, 00:14:40.606 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:40.606 "assigned_rate_limits": { 00:14:40.606 "rw_ios_per_sec": 0, 00:14:40.606 "rw_mbytes_per_sec": 0, 00:14:40.606 "r_mbytes_per_sec": 0, 00:14:40.606 "w_mbytes_per_sec": 0 00:14:40.606 }, 00:14:40.606 "claimed": true, 00:14:40.606 "claim_type": "exclusive_write", 00:14:40.606 "zoned": false, 00:14:40.606 "supported_io_types": { 00:14:40.606 "read": true, 00:14:40.606 "write": true, 00:14:40.606 "unmap": true, 00:14:40.606 "flush": true, 00:14:40.606 "reset": true, 00:14:40.606 "nvme_admin": false, 00:14:40.606 "nvme_io": false, 00:14:40.606 "nvme_io_md": false, 00:14:40.606 "write_zeroes": true, 00:14:40.606 "zcopy": true, 00:14:40.606 "get_zone_info": false, 00:14:40.606 "zone_management": false, 00:14:40.606 "zone_append": false, 00:14:40.606 "compare": false, 00:14:40.606 "compare_and_write": false, 00:14:40.606 "abort": true, 00:14:40.606 "seek_hole": false, 00:14:40.606 "seek_data": false, 00:14:40.606 "copy": true, 00:14:40.606 "nvme_iov_md": false 00:14:40.606 }, 00:14:40.606 "memory_domains": [ 00:14:40.606 { 00:14:40.606 "dma_device_id": "system", 00:14:40.606 "dma_device_type": 1 00:14:40.606 }, 00:14:40.606 { 00:14:40.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.606 "dma_device_type": 2 00:14:40.606 } 00:14:40.606 ], 00:14:40.606 "driver_specific": {} 00:14:40.606 }' 00:14:40.606 13:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.606 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.865 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.124 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.124 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.124 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.124 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:41.124 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.382 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.382 "name": "BaseBdev3", 00:14:41.382 "aliases": [ 00:14:41.382 "62f433c4-d81e-45ac-b88f-2e1b08986c6b" 00:14:41.382 ], 00:14:41.382 "product_name": "Malloc disk", 00:14:41.382 "block_size": 512, 00:14:41.382 "num_blocks": 65536, 00:14:41.382 "uuid": "62f433c4-d81e-45ac-b88f-2e1b08986c6b", 00:14:41.382 "assigned_rate_limits": { 00:14:41.382 "rw_ios_per_sec": 0, 00:14:41.382 "rw_mbytes_per_sec": 0, 00:14:41.382 "r_mbytes_per_sec": 0, 00:14:41.383 
"w_mbytes_per_sec": 0 00:14:41.383 }, 00:14:41.383 "claimed": true, 00:14:41.383 "claim_type": "exclusive_write", 00:14:41.383 "zoned": false, 00:14:41.383 "supported_io_types": { 00:14:41.383 "read": true, 00:14:41.383 "write": true, 00:14:41.383 "unmap": true, 00:14:41.383 "flush": true, 00:14:41.383 "reset": true, 00:14:41.383 "nvme_admin": false, 00:14:41.383 "nvme_io": false, 00:14:41.383 "nvme_io_md": false, 00:14:41.383 "write_zeroes": true, 00:14:41.383 "zcopy": true, 00:14:41.383 "get_zone_info": false, 00:14:41.383 "zone_management": false, 00:14:41.383 "zone_append": false, 00:14:41.383 "compare": false, 00:14:41.383 "compare_and_write": false, 00:14:41.383 "abort": true, 00:14:41.383 "seek_hole": false, 00:14:41.383 "seek_data": false, 00:14:41.383 "copy": true, 00:14:41.383 "nvme_iov_md": false 00:14:41.383 }, 00:14:41.383 "memory_domains": [ 00:14:41.383 { 00:14:41.383 "dma_device_id": "system", 00:14:41.383 "dma_device_type": 1 00:14:41.383 }, 00:14:41.383 { 00:14:41.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.383 "dma_device_type": 2 00:14:41.383 } 00:14:41.383 ], 00:14:41.383 "driver_specific": {} 00:14:41.383 }' 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.383 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.641 13:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:41.900 [2024-07-15 13:33:21.168371] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:41.900 [2024-07-15 13:33:21.168398] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:41.901 [2024-07-15 13:33:21.168437] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.901 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.160 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.160 "name": "Existed_Raid", 00:14:42.160 "uuid": "1dc65161-e56d-4f88-a896-18477171c349", 00:14:42.160 "strip_size_kb": 64, 00:14:42.160 "state": "offline", 00:14:42.160 "raid_level": "concat", 00:14:42.160 "superblock": true, 00:14:42.160 "num_base_bdevs": 3, 00:14:42.160 "num_base_bdevs_discovered": 2, 00:14:42.160 "num_base_bdevs_operational": 2, 00:14:42.160 "base_bdevs_list": [ 00:14:42.160 { 00:14:42.160 "name": null, 00:14:42.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.160 "is_configured": false, 00:14:42.160 "data_offset": 2048, 00:14:42.160 "data_size": 63488 00:14:42.160 }, 00:14:42.160 { 00:14:42.160 "name": "BaseBdev2", 00:14:42.160 "uuid": "09121ef5-c4e4-46b2-983b-6a7650e1556c", 00:14:42.160 "is_configured": true, 00:14:42.160 "data_offset": 2048, 00:14:42.160 "data_size": 
63488 00:14:42.160 }, 00:14:42.160 { 00:14:42.160 "name": "BaseBdev3", 00:14:42.160 "uuid": "62f433c4-d81e-45ac-b88f-2e1b08986c6b", 00:14:42.160 "is_configured": true, 00:14:42.160 "data_offset": 2048, 00:14:42.160 "data_size": 63488 00:14:42.160 } 00:14:42.160 ] 00:14:42.160 }' 00:14:42.160 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.160 13:33:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.727 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:42.727 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:42.727 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:42.727 13:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.986 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:42.986 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:42.986 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:43.246 [2024-07-15 13:33:22.440760] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:43.246 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:43.505 [2024-07-15 13:33:22.890301] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:43.505 [2024-07-15 13:33:22.890341] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dd400 name Existed_Raid, state offline 00:14:43.505 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.505 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.505 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.505 13:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:43.763 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:44.021 BaseBdev2 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.021 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:44.279 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:44.537 [ 00:14:44.537 { 00:14:44.537 "name": "BaseBdev2", 00:14:44.537 "aliases": [ 00:14:44.538 "dcd2d044-2c04-49d7-99e3-d82b0686927f" 00:14:44.538 ], 00:14:44.538 "product_name": "Malloc disk", 00:14:44.538 "block_size": 512, 00:14:44.538 "num_blocks": 65536, 00:14:44.538 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:44.538 "assigned_rate_limits": { 00:14:44.538 "rw_ios_per_sec": 0, 00:14:44.538 "rw_mbytes_per_sec": 0, 00:14:44.538 "r_mbytes_per_sec": 0, 00:14:44.538 "w_mbytes_per_sec": 0 00:14:44.538 }, 00:14:44.538 "claimed": false, 00:14:44.538 "zoned": false, 00:14:44.538 "supported_io_types": { 00:14:44.538 "read": true, 00:14:44.538 "write": true, 00:14:44.538 "unmap": true, 00:14:44.538 "flush": 
true, 00:14:44.538 "reset": true, 00:14:44.538 "nvme_admin": false, 00:14:44.538 "nvme_io": false, 00:14:44.538 "nvme_io_md": false, 00:14:44.538 "write_zeroes": true, 00:14:44.538 "zcopy": true, 00:14:44.538 "get_zone_info": false, 00:14:44.538 "zone_management": false, 00:14:44.538 "zone_append": false, 00:14:44.538 "compare": false, 00:14:44.538 "compare_and_write": false, 00:14:44.538 "abort": true, 00:14:44.538 "seek_hole": false, 00:14:44.538 "seek_data": false, 00:14:44.538 "copy": true, 00:14:44.538 "nvme_iov_md": false 00:14:44.538 }, 00:14:44.538 "memory_domains": [ 00:14:44.538 { 00:14:44.538 "dma_device_id": "system", 00:14:44.538 "dma_device_type": 1 00:14:44.538 }, 00:14:44.538 { 00:14:44.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.538 "dma_device_type": 2 00:14:44.538 } 00:14:44.538 ], 00:14:44.538 "driver_specific": {} 00:14:44.538 } 00:14:44.538 ] 00:14:44.538 13:33:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:44.538 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:44.538 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:44.538 13:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:44.796 BaseBdev3 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.796 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.055 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:45.314 [ 00:14:45.314 { 00:14:45.314 "name": "BaseBdev3", 00:14:45.314 "aliases": [ 00:14:45.314 "14701bfe-c617-4a95-9175-8b5ea649136c" 00:14:45.314 ], 00:14:45.314 "product_name": "Malloc disk", 00:14:45.314 "block_size": 512, 00:14:45.314 "num_blocks": 65536, 00:14:45.314 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:45.314 "assigned_rate_limits": { 00:14:45.314 "rw_ios_per_sec": 0, 00:14:45.314 "rw_mbytes_per_sec": 0, 00:14:45.314 "r_mbytes_per_sec": 0, 00:14:45.314 "w_mbytes_per_sec": 0 00:14:45.314 }, 00:14:45.314 "claimed": false, 00:14:45.314 "zoned": false, 00:14:45.314 "supported_io_types": { 00:14:45.314 "read": true, 00:14:45.314 "write": true, 00:14:45.314 "unmap": true, 00:14:45.314 "flush": true, 00:14:45.314 "reset": true, 00:14:45.314 "nvme_admin": false, 00:14:45.314 "nvme_io": false, 00:14:45.314 "nvme_io_md": false, 00:14:45.314 "write_zeroes": true, 00:14:45.314 "zcopy": true, 00:14:45.314 "get_zone_info": false, 00:14:45.314 "zone_management": false, 00:14:45.314 "zone_append": false, 00:14:45.314 "compare": false, 00:14:45.314 "compare_and_write": false, 00:14:45.314 "abort": true, 00:14:45.314 "seek_hole": false, 00:14:45.314 "seek_data": false, 00:14:45.314 "copy": true, 00:14:45.314 "nvme_iov_md": false 00:14:45.314 }, 00:14:45.314 "memory_domains": [ 00:14:45.314 { 00:14:45.314 "dma_device_id": "system", 00:14:45.314 "dma_device_type": 1 
00:14:45.314 }, 00:14:45.314 { 00:14:45.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.314 "dma_device_type": 2 00:14:45.314 } 00:14:45.314 ], 00:14:45.314 "driver_specific": {} 00:14:45.314 } 00:14:45.314 ] 00:14:45.314 13:33:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:45.314 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.314 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.314 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:45.574 [2024-07-15 13:33:24.858683] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:45.574 [2024-07-15 13:33:24.858727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:45.574 [2024-07-15 13:33:24.858745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:45.574 [2024-07-15 13:33:24.860056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.574 13:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.834 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.834 "name": "Existed_Raid", 00:14:45.834 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:45.834 "strip_size_kb": 64, 00:14:45.834 "state": "configuring", 00:14:45.834 "raid_level": "concat", 00:14:45.834 "superblock": true, 00:14:45.834 "num_base_bdevs": 3, 00:14:45.834 "num_base_bdevs_discovered": 2, 00:14:45.834 "num_base_bdevs_operational": 3, 00:14:45.834 "base_bdevs_list": [ 00:14:45.834 { 00:14:45.834 "name": "BaseBdev1", 00:14:45.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.834 "is_configured": false, 00:14:45.834 "data_offset": 0, 00:14:45.834 "data_size": 0 00:14:45.834 }, 00:14:45.834 { 00:14:45.834 "name": "BaseBdev2", 00:14:45.834 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:45.834 "is_configured": true, 00:14:45.834 "data_offset": 2048, 00:14:45.834 "data_size": 63488 00:14:45.834 }, 00:14:45.834 { 00:14:45.834 "name": "BaseBdev3", 00:14:45.834 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:45.834 "is_configured": true, 00:14:45.834 "data_offset": 2048, 00:14:45.834 
"data_size": 63488 00:14:45.834 } 00:14:45.834 ] 00:14:45.834 }' 00:14:45.834 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.834 13:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.426 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:46.686 [2024-07-15 13:33:25.961558] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:46.686 13:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.945 13:33:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.945 "name": "Existed_Raid", 00:14:46.945 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:46.945 "strip_size_kb": 64, 00:14:46.945 "state": "configuring", 00:14:46.945 "raid_level": "concat", 00:14:46.945 "superblock": true, 00:14:46.945 "num_base_bdevs": 3, 00:14:46.945 "num_base_bdevs_discovered": 1, 00:14:46.945 "num_base_bdevs_operational": 3, 00:14:46.945 "base_bdevs_list": [ 00:14:46.945 { 00:14:46.945 "name": "BaseBdev1", 00:14:46.945 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.945 "is_configured": false, 00:14:46.945 "data_offset": 0, 00:14:46.945 "data_size": 0 00:14:46.945 }, 00:14:46.945 { 00:14:46.945 "name": null, 00:14:46.945 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:46.945 "is_configured": false, 00:14:46.945 "data_offset": 2048, 00:14:46.945 "data_size": 63488 00:14:46.945 }, 00:14:46.945 { 00:14:46.945 "name": "BaseBdev3", 00:14:46.945 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:46.945 "is_configured": true, 00:14:46.945 "data_offset": 2048, 00:14:46.945 "data_size": 63488 00:14:46.945 } 00:14:46.945 ] 00:14:46.945 }' 00:14:46.945 13:33:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.945 13:33:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.513 13:33:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.513 13:33:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:47.772 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:14:47.772 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.031 [2024-07-15 13:33:27.301654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.031 BaseBdev1 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.031 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.290 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:48.548 [ 00:14:48.548 { 00:14:48.548 "name": "BaseBdev1", 00:14:48.548 "aliases": [ 00:14:48.548 "6df39694-9f11-4189-aff0-e936909e4798" 00:14:48.548 ], 00:14:48.548 "product_name": "Malloc disk", 00:14:48.548 "block_size": 512, 00:14:48.548 "num_blocks": 65536, 00:14:48.548 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:48.548 "assigned_rate_limits": { 00:14:48.548 "rw_ios_per_sec": 0, 00:14:48.548 "rw_mbytes_per_sec": 0, 00:14:48.548 "r_mbytes_per_sec": 0, 00:14:48.548 
"w_mbytes_per_sec": 0 00:14:48.548 }, 00:14:48.548 "claimed": true, 00:14:48.548 "claim_type": "exclusive_write", 00:14:48.548 "zoned": false, 00:14:48.548 "supported_io_types": { 00:14:48.548 "read": true, 00:14:48.548 "write": true, 00:14:48.548 "unmap": true, 00:14:48.548 "flush": true, 00:14:48.548 "reset": true, 00:14:48.548 "nvme_admin": false, 00:14:48.548 "nvme_io": false, 00:14:48.548 "nvme_io_md": false, 00:14:48.548 "write_zeroes": true, 00:14:48.548 "zcopy": true, 00:14:48.548 "get_zone_info": false, 00:14:48.548 "zone_management": false, 00:14:48.548 "zone_append": false, 00:14:48.548 "compare": false, 00:14:48.548 "compare_and_write": false, 00:14:48.548 "abort": true, 00:14:48.548 "seek_hole": false, 00:14:48.548 "seek_data": false, 00:14:48.548 "copy": true, 00:14:48.548 "nvme_iov_md": false 00:14:48.548 }, 00:14:48.548 "memory_domains": [ 00:14:48.548 { 00:14:48.548 "dma_device_id": "system", 00:14:48.548 "dma_device_type": 1 00:14:48.548 }, 00:14:48.548 { 00:14:48.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.548 "dma_device_type": 2 00:14:48.548 } 00:14:48.548 ], 00:14:48.548 "driver_specific": {} 00:14:48.548 } 00:14:48.548 ] 00:14:48.548 13:33:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:48.548 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.549 13:33:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.807 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.807 "name": "Existed_Raid", 00:14:48.807 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:48.807 "strip_size_kb": 64, 00:14:48.807 "state": "configuring", 00:14:48.807 "raid_level": "concat", 00:14:48.807 "superblock": true, 00:14:48.807 "num_base_bdevs": 3, 00:14:48.807 "num_base_bdevs_discovered": 2, 00:14:48.807 "num_base_bdevs_operational": 3, 00:14:48.807 "base_bdevs_list": [ 00:14:48.807 { 00:14:48.807 "name": "BaseBdev1", 00:14:48.807 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:48.807 "is_configured": true, 00:14:48.807 "data_offset": 2048, 00:14:48.807 "data_size": 63488 00:14:48.807 }, 00:14:48.807 { 00:14:48.807 "name": null, 00:14:48.807 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:48.807 "is_configured": false, 00:14:48.807 "data_offset": 2048, 00:14:48.807 "data_size": 63488 00:14:48.807 }, 00:14:48.807 { 00:14:48.807 "name": "BaseBdev3", 00:14:48.807 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:48.807 "is_configured": true, 00:14:48.807 "data_offset": 2048, 00:14:48.807 "data_size": 63488 00:14:48.807 } 
00:14:48.807 ] 00:14:48.807 }' 00:14:48.807 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.807 13:33:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:49.373 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.373 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:49.631 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:49.631 13:33:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:49.631 [2024-07-15 13:33:29.050328] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.890 
13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.890 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.150 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.150 "name": "Existed_Raid", 00:14:50.150 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:50.150 "strip_size_kb": 64, 00:14:50.150 "state": "configuring", 00:14:50.150 "raid_level": "concat", 00:14:50.150 "superblock": true, 00:14:50.150 "num_base_bdevs": 3, 00:14:50.150 "num_base_bdevs_discovered": 1, 00:14:50.150 "num_base_bdevs_operational": 3, 00:14:50.150 "base_bdevs_list": [ 00:14:50.150 { 00:14:50.150 "name": "BaseBdev1", 00:14:50.150 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:50.150 "is_configured": true, 00:14:50.150 "data_offset": 2048, 00:14:50.150 "data_size": 63488 00:14:50.150 }, 00:14:50.150 { 00:14:50.150 "name": null, 00:14:50.150 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:50.150 "is_configured": false, 00:14:50.150 "data_offset": 2048, 00:14:50.150 "data_size": 63488 00:14:50.150 }, 00:14:50.150 { 00:14:50.150 "name": null, 00:14:50.150 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:50.150 "is_configured": false, 00:14:50.150 "data_offset": 2048, 00:14:50.150 "data_size": 63488 00:14:50.150 } 00:14:50.150 ] 00:14:50.150 }' 00:14:50.150 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.150 13:33:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.718 13:33:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.718 13:33:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:50.977 [2024-07-15 13:33:30.381883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.977 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.235 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.235 13:33:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.236 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.236 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.236 "name": "Existed_Raid", 00:14:51.236 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:51.236 "strip_size_kb": 64, 00:14:51.236 "state": "configuring", 00:14:51.236 "raid_level": "concat", 00:14:51.236 "superblock": true, 00:14:51.236 "num_base_bdevs": 3, 00:14:51.236 "num_base_bdevs_discovered": 2, 00:14:51.236 "num_base_bdevs_operational": 3, 00:14:51.236 "base_bdevs_list": [ 00:14:51.236 { 00:14:51.236 "name": "BaseBdev1", 00:14:51.236 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:51.236 "is_configured": true, 00:14:51.236 "data_offset": 2048, 00:14:51.236 "data_size": 63488 00:14:51.236 }, 00:14:51.236 { 00:14:51.236 "name": null, 00:14:51.236 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:51.236 "is_configured": false, 00:14:51.236 "data_offset": 2048, 00:14:51.236 "data_size": 63488 00:14:51.236 }, 00:14:51.236 { 00:14:51.236 "name": "BaseBdev3", 00:14:51.236 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:51.236 "is_configured": true, 00:14:51.236 "data_offset": 2048, 00:14:51.236 "data_size": 63488 00:14:51.236 } 00:14:51.236 ] 00:14:51.236 }' 00:14:51.236 13:33:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.236 13:33:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:51.803 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.803 13:33:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:52.060 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:52.060 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:52.319 [2024-07-15 13:33:31.681346] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.319 13:33:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.576 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.576 "name": "Existed_Raid", 00:14:52.576 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:52.576 "strip_size_kb": 64, 00:14:52.576 "state": "configuring", 00:14:52.576 "raid_level": "concat", 00:14:52.576 "superblock": true, 00:14:52.576 "num_base_bdevs": 3, 00:14:52.576 "num_base_bdevs_discovered": 1, 00:14:52.576 "num_base_bdevs_operational": 3, 00:14:52.576 "base_bdevs_list": [ 00:14:52.576 { 00:14:52.576 "name": null, 00:14:52.576 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:52.576 "is_configured": false, 00:14:52.576 "data_offset": 2048, 00:14:52.576 "data_size": 63488 00:14:52.576 }, 00:14:52.576 { 00:14:52.576 "name": null, 00:14:52.576 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:52.576 "is_configured": false, 00:14:52.576 "data_offset": 2048, 00:14:52.576 "data_size": 63488 00:14:52.576 }, 00:14:52.576 { 00:14:52.576 "name": "BaseBdev3", 00:14:52.576 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:52.576 "is_configured": true, 00:14:52.576 "data_offset": 2048, 00:14:52.576 "data_size": 63488 00:14:52.576 } 00:14:52.576 ] 00:14:52.576 }' 00:14:52.576 13:33:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.576 13:33:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:53.141 13:33:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:53.141 13:33:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.399 13:33:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:53.399 13:33:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:53.658 [2024-07-15 13:33:33.013156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.658 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.916 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.916 "name": 
"Existed_Raid", 00:14:53.916 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:53.916 "strip_size_kb": 64, 00:14:53.916 "state": "configuring", 00:14:53.916 "raid_level": "concat", 00:14:53.916 "superblock": true, 00:14:53.916 "num_base_bdevs": 3, 00:14:53.916 "num_base_bdevs_discovered": 2, 00:14:53.916 "num_base_bdevs_operational": 3, 00:14:53.916 "base_bdevs_list": [ 00:14:53.916 { 00:14:53.916 "name": null, 00:14:53.916 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:53.916 "is_configured": false, 00:14:53.916 "data_offset": 2048, 00:14:53.916 "data_size": 63488 00:14:53.916 }, 00:14:53.916 { 00:14:53.916 "name": "BaseBdev2", 00:14:53.916 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:53.916 "is_configured": true, 00:14:53.916 "data_offset": 2048, 00:14:53.916 "data_size": 63488 00:14:53.916 }, 00:14:53.916 { 00:14:53.916 "name": "BaseBdev3", 00:14:53.916 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:53.916 "is_configured": true, 00:14:53.916 "data_offset": 2048, 00:14:53.916 "data_size": 63488 00:14:53.916 } 00:14:53.916 ] 00:14:53.916 }' 00:14:53.916 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.916 13:33:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:54.519 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.519 13:33:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:54.778 13:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:54.779 13:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:54.779 13:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.037 13:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6df39694-9f11-4189-aff0-e936909e4798 00:14:55.296 [2024-07-15 13:33:34.605970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:55.296 [2024-07-15 13:33:34.606122] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17dbf50 00:14:55.296 [2024-07-15 13:33:34.606136] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:55.296 [2024-07-15 13:33:34.606313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14e2940 00:14:55.296 [2024-07-15 13:33:34.606426] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17dbf50 00:14:55.296 [2024-07-15 13:33:34.606436] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17dbf50 00:14:55.296 [2024-07-15 13:33:34.606526] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:55.296 NewBaseBdev 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:55.296 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:55.296 13:33:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:55.555 13:33:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:55.815 [ 00:14:55.815 { 00:14:55.815 "name": "NewBaseBdev", 00:14:55.815 "aliases": [ 00:14:55.815 "6df39694-9f11-4189-aff0-e936909e4798" 00:14:55.815 ], 00:14:55.815 "product_name": "Malloc disk", 00:14:55.815 "block_size": 512, 00:14:55.815 "num_blocks": 65536, 00:14:55.815 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:55.815 "assigned_rate_limits": { 00:14:55.815 "rw_ios_per_sec": 0, 00:14:55.815 "rw_mbytes_per_sec": 0, 00:14:55.815 "r_mbytes_per_sec": 0, 00:14:55.815 "w_mbytes_per_sec": 0 00:14:55.815 }, 00:14:55.815 "claimed": true, 00:14:55.815 "claim_type": "exclusive_write", 00:14:55.815 "zoned": false, 00:14:55.815 "supported_io_types": { 00:14:55.815 "read": true, 00:14:55.815 "write": true, 00:14:55.815 "unmap": true, 00:14:55.815 "flush": true, 00:14:55.815 "reset": true, 00:14:55.815 "nvme_admin": false, 00:14:55.815 "nvme_io": false, 00:14:55.815 "nvme_io_md": false, 00:14:55.815 "write_zeroes": true, 00:14:55.815 "zcopy": true, 00:14:55.815 "get_zone_info": false, 00:14:55.815 "zone_management": false, 00:14:55.815 "zone_append": false, 00:14:55.815 "compare": false, 00:14:55.815 "compare_and_write": false, 00:14:55.815 "abort": true, 00:14:55.815 "seek_hole": false, 00:14:55.815 "seek_data": false, 00:14:55.815 "copy": true, 00:14:55.815 "nvme_iov_md": false 00:14:55.815 }, 00:14:55.815 "memory_domains": [ 00:14:55.815 { 00:14:55.815 "dma_device_id": "system", 00:14:55.815 "dma_device_type": 1 00:14:55.815 }, 00:14:55.815 { 00:14:55.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.815 "dma_device_type": 2 00:14:55.815 } 
00:14:55.815 ], 00:14:55.815 "driver_specific": {} 00:14:55.815 } 00:14:55.815 ] 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.815 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.074 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.074 "name": "Existed_Raid", 00:14:56.074 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:56.074 "strip_size_kb": 64, 00:14:56.074 "state": "online", 00:14:56.074 
"raid_level": "concat", 00:14:56.074 "superblock": true, 00:14:56.074 "num_base_bdevs": 3, 00:14:56.074 "num_base_bdevs_discovered": 3, 00:14:56.074 "num_base_bdevs_operational": 3, 00:14:56.074 "base_bdevs_list": [ 00:14:56.074 { 00:14:56.074 "name": "NewBaseBdev", 00:14:56.074 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:56.074 "is_configured": true, 00:14:56.074 "data_offset": 2048, 00:14:56.074 "data_size": 63488 00:14:56.074 }, 00:14:56.074 { 00:14:56.074 "name": "BaseBdev2", 00:14:56.074 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:56.074 "is_configured": true, 00:14:56.074 "data_offset": 2048, 00:14:56.074 "data_size": 63488 00:14:56.074 }, 00:14:56.074 { 00:14:56.074 "name": "BaseBdev3", 00:14:56.074 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:56.074 "is_configured": true, 00:14:56.074 "data_offset": 2048, 00:14:56.074 "data_size": 63488 00:14:56.074 } 00:14:56.074 ] 00:14:56.074 }' 00:14:56.074 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.074 13:33:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:56.642 13:33:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:56.901 [2024-07-15 13:33:36.174425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:56.901 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:56.901 "name": "Existed_Raid", 00:14:56.901 "aliases": [ 00:14:56.901 "526ce33b-feec-4d39-962d-daaa1b4ebd5f" 00:14:56.901 ], 00:14:56.901 "product_name": "Raid Volume", 00:14:56.901 "block_size": 512, 00:14:56.901 "num_blocks": 190464, 00:14:56.901 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:56.901 "assigned_rate_limits": { 00:14:56.901 "rw_ios_per_sec": 0, 00:14:56.901 "rw_mbytes_per_sec": 0, 00:14:56.901 "r_mbytes_per_sec": 0, 00:14:56.901 "w_mbytes_per_sec": 0 00:14:56.901 }, 00:14:56.901 "claimed": false, 00:14:56.901 "zoned": false, 00:14:56.901 "supported_io_types": { 00:14:56.901 "read": true, 00:14:56.901 "write": true, 00:14:56.901 "unmap": true, 00:14:56.901 "flush": true, 00:14:56.901 "reset": true, 00:14:56.901 "nvme_admin": false, 00:14:56.901 "nvme_io": false, 00:14:56.901 "nvme_io_md": false, 00:14:56.901 "write_zeroes": true, 00:14:56.901 "zcopy": false, 00:14:56.901 "get_zone_info": false, 00:14:56.901 "zone_management": false, 00:14:56.901 "zone_append": false, 00:14:56.901 "compare": false, 00:14:56.901 "compare_and_write": false, 00:14:56.901 "abort": false, 00:14:56.902 "seek_hole": false, 00:14:56.902 "seek_data": false, 00:14:56.902 "copy": false, 00:14:56.902 "nvme_iov_md": false 00:14:56.902 }, 00:14:56.902 "memory_domains": [ 00:14:56.902 { 00:14:56.902 "dma_device_id": "system", 00:14:56.902 "dma_device_type": 1 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.902 "dma_device_type": 2 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "dma_device_id": "system", 00:14:56.902 "dma_device_type": 1 00:14:56.902 
}, 00:14:56.902 { 00:14:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.902 "dma_device_type": 2 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "dma_device_id": "system", 00:14:56.902 "dma_device_type": 1 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.902 "dma_device_type": 2 00:14:56.902 } 00:14:56.902 ], 00:14:56.902 "driver_specific": { 00:14:56.902 "raid": { 00:14:56.902 "uuid": "526ce33b-feec-4d39-962d-daaa1b4ebd5f", 00:14:56.902 "strip_size_kb": 64, 00:14:56.902 "state": "online", 00:14:56.902 "raid_level": "concat", 00:14:56.902 "superblock": true, 00:14:56.902 "num_base_bdevs": 3, 00:14:56.902 "num_base_bdevs_discovered": 3, 00:14:56.902 "num_base_bdevs_operational": 3, 00:14:56.902 "base_bdevs_list": [ 00:14:56.902 { 00:14:56.902 "name": "NewBaseBdev", 00:14:56.902 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:56.902 "is_configured": true, 00:14:56.902 "data_offset": 2048, 00:14:56.902 "data_size": 63488 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "name": "BaseBdev2", 00:14:56.902 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:56.902 "is_configured": true, 00:14:56.902 "data_offset": 2048, 00:14:56.902 "data_size": 63488 00:14:56.902 }, 00:14:56.902 { 00:14:56.902 "name": "BaseBdev3", 00:14:56.902 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:56.902 "is_configured": true, 00:14:56.902 "data_offset": 2048, 00:14:56.902 "data_size": 63488 00:14:56.902 } 00:14:56.902 ] 00:14:56.902 } 00:14:56.902 } 00:14:56.902 }' 00:14:56.902 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:56.902 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:56.902 BaseBdev2 00:14:56.902 BaseBdev3' 00:14:56.902 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.902 
13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:56.902 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.161 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.161 "name": "NewBaseBdev", 00:14:57.161 "aliases": [ 00:14:57.161 "6df39694-9f11-4189-aff0-e936909e4798" 00:14:57.161 ], 00:14:57.161 "product_name": "Malloc disk", 00:14:57.161 "block_size": 512, 00:14:57.161 "num_blocks": 65536, 00:14:57.161 "uuid": "6df39694-9f11-4189-aff0-e936909e4798", 00:14:57.161 "assigned_rate_limits": { 00:14:57.161 "rw_ios_per_sec": 0, 00:14:57.161 "rw_mbytes_per_sec": 0, 00:14:57.161 "r_mbytes_per_sec": 0, 00:14:57.161 "w_mbytes_per_sec": 0 00:14:57.161 }, 00:14:57.161 "claimed": true, 00:14:57.161 "claim_type": "exclusive_write", 00:14:57.161 "zoned": false, 00:14:57.161 "supported_io_types": { 00:14:57.161 "read": true, 00:14:57.161 "write": true, 00:14:57.161 "unmap": true, 00:14:57.161 "flush": true, 00:14:57.161 "reset": true, 00:14:57.161 "nvme_admin": false, 00:14:57.161 "nvme_io": false, 00:14:57.161 "nvme_io_md": false, 00:14:57.161 "write_zeroes": true, 00:14:57.162 "zcopy": true, 00:14:57.162 "get_zone_info": false, 00:14:57.162 "zone_management": false, 00:14:57.162 "zone_append": false, 00:14:57.162 "compare": false, 00:14:57.162 "compare_and_write": false, 00:14:57.162 "abort": true, 00:14:57.162 "seek_hole": false, 00:14:57.162 "seek_data": false, 00:14:57.162 "copy": true, 00:14:57.162 "nvme_iov_md": false 00:14:57.162 }, 00:14:57.162 "memory_domains": [ 00:14:57.162 { 00:14:57.162 "dma_device_id": "system", 00:14:57.162 "dma_device_type": 1 00:14:57.162 }, 00:14:57.162 { 00:14:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.162 "dma_device_type": 2 00:14:57.162 } 00:14:57.162 ], 00:14:57.162 
"driver_specific": {} 00:14:57.162 }' 00:14:57.162 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.162 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.162 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.162 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.420 13:33:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:57.679 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.679 "name": "BaseBdev2", 00:14:57.679 "aliases": [ 00:14:57.679 "dcd2d044-2c04-49d7-99e3-d82b0686927f" 00:14:57.679 ], 00:14:57.679 "product_name": 
"Malloc disk", 00:14:57.679 "block_size": 512, 00:14:57.679 "num_blocks": 65536, 00:14:57.679 "uuid": "dcd2d044-2c04-49d7-99e3-d82b0686927f", 00:14:57.679 "assigned_rate_limits": { 00:14:57.679 "rw_ios_per_sec": 0, 00:14:57.679 "rw_mbytes_per_sec": 0, 00:14:57.679 "r_mbytes_per_sec": 0, 00:14:57.679 "w_mbytes_per_sec": 0 00:14:57.679 }, 00:14:57.679 "claimed": true, 00:14:57.679 "claim_type": "exclusive_write", 00:14:57.679 "zoned": false, 00:14:57.679 "supported_io_types": { 00:14:57.679 "read": true, 00:14:57.679 "write": true, 00:14:57.679 "unmap": true, 00:14:57.679 "flush": true, 00:14:57.679 "reset": true, 00:14:57.679 "nvme_admin": false, 00:14:57.679 "nvme_io": false, 00:14:57.679 "nvme_io_md": false, 00:14:57.679 "write_zeroes": true, 00:14:57.679 "zcopy": true, 00:14:57.679 "get_zone_info": false, 00:14:57.679 "zone_management": false, 00:14:57.679 "zone_append": false, 00:14:57.679 "compare": false, 00:14:57.679 "compare_and_write": false, 00:14:57.679 "abort": true, 00:14:57.679 "seek_hole": false, 00:14:57.679 "seek_data": false, 00:14:57.679 "copy": true, 00:14:57.679 "nvme_iov_md": false 00:14:57.679 }, 00:14:57.679 "memory_domains": [ 00:14:57.679 { 00:14:57.679 "dma_device_id": "system", 00:14:57.679 "dma_device_type": 1 00:14:57.679 }, 00:14:57.679 { 00:14:57.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.679 "dma_device_type": 2 00:14:57.679 } 00:14:57.679 ], 00:14:57.679 "driver_specific": {} 00:14:57.679 }' 00:14:57.679 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.938 
13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.938 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.199 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.199 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.199 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.199 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:58.199 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.457 "name": "BaseBdev3", 00:14:58.457 "aliases": [ 00:14:58.457 "14701bfe-c617-4a95-9175-8b5ea649136c" 00:14:58.457 ], 00:14:58.457 "product_name": "Malloc disk", 00:14:58.457 "block_size": 512, 00:14:58.457 "num_blocks": 65536, 00:14:58.457 "uuid": "14701bfe-c617-4a95-9175-8b5ea649136c", 00:14:58.457 "assigned_rate_limits": { 00:14:58.457 "rw_ios_per_sec": 0, 00:14:58.457 "rw_mbytes_per_sec": 0, 00:14:58.457 "r_mbytes_per_sec": 0, 00:14:58.457 "w_mbytes_per_sec": 0 00:14:58.457 }, 00:14:58.457 "claimed": true, 00:14:58.457 "claim_type": "exclusive_write", 00:14:58.457 "zoned": false, 00:14:58.457 "supported_io_types": { 00:14:58.457 "read": true, 00:14:58.457 "write": true, 00:14:58.457 "unmap": true, 
00:14:58.457 "flush": true, 00:14:58.457 "reset": true, 00:14:58.457 "nvme_admin": false, 00:14:58.457 "nvme_io": false, 00:14:58.457 "nvme_io_md": false, 00:14:58.457 "write_zeroes": true, 00:14:58.457 "zcopy": true, 00:14:58.457 "get_zone_info": false, 00:14:58.457 "zone_management": false, 00:14:58.457 "zone_append": false, 00:14:58.457 "compare": false, 00:14:58.457 "compare_and_write": false, 00:14:58.457 "abort": true, 00:14:58.457 "seek_hole": false, 00:14:58.457 "seek_data": false, 00:14:58.457 "copy": true, 00:14:58.457 "nvme_iov_md": false 00:14:58.457 }, 00:14:58.457 "memory_domains": [ 00:14:58.457 { 00:14:58.457 "dma_device_id": "system", 00:14:58.457 "dma_device_type": 1 00:14:58.457 }, 00:14:58.457 { 00:14:58.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.457 "dma_device_type": 2 00:14:58.457 } 00:14:58.457 ], 00:14:58.457 "driver_specific": {} 00:14:58.457 }' 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.457 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.715 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.715 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.715 13:33:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.715 13:33:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.715 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.715 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:58.974 [2024-07-15 13:33:38.275687] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:58.974 [2024-07-15 13:33:38.275715] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:58.974 [2024-07-15 13:33:38.275771] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:58.974 [2024-07-15 13:33:38.275824] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:58.974 [2024-07-15 13:33:38.275837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dbf50 name Existed_Raid, state offline 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2105177 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2105177 ']' 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2105177 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2105177 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2105177' 00:14:58.974 killing process with pid 2105177 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2105177 00:14:58.974 [2024-07-15 13:33:38.341626] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:58.974 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2105177 00:14:58.974 [2024-07-15 13:33:38.371923] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:59.232 13:33:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:59.232 00:14:59.232 real 0m28.149s 00:14:59.232 user 0m51.536s 00:14:59.232 sys 0m5.138s 00:14:59.232 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:59.232 13:33:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:59.232 ************************************ 00:14:59.232 END TEST raid_state_function_test_sb 00:14:59.232 ************************************ 00:14:59.232 13:33:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:59.232 13:33:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:59.232 13:33:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:59.232 13:33:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:59.232 13:33:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:59.232 ************************************ 00:14:59.232 START TEST raid_superblock_test 00:14:59.232 ************************************ 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2109474 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2109474 /var/tmp/spdk-raid.sock 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2109474 ']' 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:59.232 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:59.232 13:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.491 [2024-07-15 13:33:38.711214] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:14:59.491 [2024-07-15 13:33:38.711282] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2109474 ] 00:14:59.491 [2024-07-15 13:33:38.837580] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:59.749 [2024-07-15 13:33:38.941304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:59.749 [2024-07-15 13:33:39.009730] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:59.749 [2024-07-15 13:33:39.009775] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:00.314 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:00.573 malloc1 00:15:00.573 13:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:00.831 [2024-07-15 13:33:40.117426] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:00.831 [2024-07-15 13:33:40.117480] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.831 [2024-07-15 13:33:40.117503] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259c570 00:15:00.831 [2024-07-15 13:33:40.117516] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.831 [2024-07-15 13:33:40.119243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.831 [2024-07-15 13:33:40.119271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:00.831 pt1 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:00.831 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:00.831 13:33:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:01.089 malloc2 00:15:01.089 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:01.347 [2024-07-15 13:33:40.611750] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:01.347 [2024-07-15 13:33:40.611798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.347 [2024-07-15 13:33:40.611816] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259d970 00:15:01.347 [2024-07-15 13:33:40.611830] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.347 [2024-07-15 13:33:40.613486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.347 [2024-07-15 13:33:40.613522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:01.347 pt2 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:01.347 13:33:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:01.347 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:01.605 malloc3 00:15:01.605 13:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:01.863 [2024-07-15 13:33:41.098947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:01.863 [2024-07-15 13:33:41.098996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.863 [2024-07-15 13:33:41.099014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2734340 00:15:01.863 [2024-07-15 13:33:41.099027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.863 [2024-07-15 13:33:41.100627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.863 [2024-07-15 13:33:41.100658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:01.863 pt3 00:15:01.863 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:01.863 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:01.863 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:02.122 [2024-07-15 13:33:41.343606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:02.122 [2024-07-15 13:33:41.344972] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
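The setup sequence traced above (bdev_raid.sh@415 through @429) builds each base bdev in a loop: create `malloc<i>`, wrap it in a passthru bdev `pt<i>` with a fixed UUID, then assemble the concat raid from the passthru bdevs. A minimal Python sketch of the command sequence the loop issues — the real harness sends these through `scripts/rpc.py -s /var/tmp/spdk-raid.sock`; this only reconstructs the strings, it does not talk to SPDK:

```python
# Emulate the bdev_raid.sh@415-@429 setup loop for num_base_bdevs=3.
num_base_bdevs = 3
base_bdevs_malloc, base_bdevs_pt, base_bdevs_pt_uuid = [], [], []
cmds = []
for i in range(1, num_base_bdevs + 1):
    bdev_malloc = f"malloc{i}"
    bdev_pt = f"pt{i}"
    bdev_pt_uuid = f"00000000-0000-0000-0000-{i:012d}"
    base_bdevs_malloc.append(bdev_malloc)
    base_bdevs_pt.append(bdev_pt)
    base_bdevs_pt_uuid.append(bdev_pt_uuid)
    # @424: 32 MiB malloc bdev with 512-byte blocks
    cmds.append(f"bdev_malloc_create 32 512 -b {bdev_malloc}")
    # @425: passthru bdev claiming the malloc bdev
    cmds.append(
        f"bdev_passthru_create -b {bdev_malloc} -p {bdev_pt} -u {bdev_pt_uuid}"
    )
# @429: concat raid with 64 KiB strip size and superblock (-s)
cmds.append(
    "bdev_raid_create -z 64 -r concat "
    f"-b '{' '.join(base_bdevs_pt)}' -n raid_bdev1 -s"
)
print(cmds[-1])
```

The final command matches the `bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s` call seen in the trace.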
00:15:02.122 [2024-07-15 13:33:41.345030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:02.122 [2024-07-15 13:33:41.345178] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2594ea0 00:15:02.122 [2024-07-15 13:33:41.345190] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:02.122 [2024-07-15 13:33:41.345398] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259c240 00:15:02.122 [2024-07-15 13:33:41.345546] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2594ea0 00:15:02.122 [2024-07-15 13:33:41.345556] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2594ea0 00:15:02.122 [2024-07-15 13:33:41.345656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.122 
13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.122 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:02.381 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.381 "name": "raid_bdev1", 00:15:02.381 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:02.381 "strip_size_kb": 64, 00:15:02.381 "state": "online", 00:15:02.381 "raid_level": "concat", 00:15:02.381 "superblock": true, 00:15:02.381 "num_base_bdevs": 3, 00:15:02.381 "num_base_bdevs_discovered": 3, 00:15:02.381 "num_base_bdevs_operational": 3, 00:15:02.381 "base_bdevs_list": [ 00:15:02.381 { 00:15:02.381 "name": "pt1", 00:15:02.381 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:02.381 "is_configured": true, 00:15:02.381 "data_offset": 2048, 00:15:02.381 "data_size": 63488 00:15:02.381 }, 00:15:02.381 { 00:15:02.381 "name": "pt2", 00:15:02.381 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:02.381 "is_configured": true, 00:15:02.381 "data_offset": 2048, 00:15:02.381 "data_size": 63488 00:15:02.381 }, 00:15:02.381 { 00:15:02.381 "name": "pt3", 00:15:02.381 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:02.381 "is_configured": true, 00:15:02.381 "data_offset": 2048, 00:15:02.381 "data_size": 63488 00:15:02.381 } 00:15:02.381 ] 00:15:02.381 }' 00:15:02.381 13:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.381 13:33:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:02.949 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:02.949 [2024-07-15 13:33:42.354511] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:03.208 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:03.208 "name": "raid_bdev1", 00:15:03.208 "aliases": [ 00:15:03.208 "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28" 00:15:03.208 ], 00:15:03.208 "product_name": "Raid Volume", 00:15:03.208 "block_size": 512, 00:15:03.208 "num_blocks": 190464, 00:15:03.208 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:03.208 "assigned_rate_limits": { 00:15:03.208 "rw_ios_per_sec": 0, 00:15:03.208 "rw_mbytes_per_sec": 0, 00:15:03.208 "r_mbytes_per_sec": 0, 00:15:03.208 "w_mbytes_per_sec": 0 00:15:03.208 }, 00:15:03.208 "claimed": false, 00:15:03.208 "zoned": false, 00:15:03.208 "supported_io_types": { 00:15:03.208 "read": true, 00:15:03.208 "write": true, 00:15:03.208 "unmap": true, 00:15:03.208 "flush": true, 00:15:03.208 "reset": true, 00:15:03.208 "nvme_admin": false, 00:15:03.208 "nvme_io": false, 00:15:03.208 "nvme_io_md": false, 00:15:03.208 "write_zeroes": true, 00:15:03.208 "zcopy": false, 00:15:03.208 "get_zone_info": false, 00:15:03.208 "zone_management": false, 00:15:03.208 "zone_append": false, 00:15:03.208 "compare": false, 00:15:03.208 "compare_and_write": false, 00:15:03.208 "abort": false, 00:15:03.208 
"seek_hole": false, 00:15:03.208 "seek_data": false, 00:15:03.208 "copy": false, 00:15:03.208 "nvme_iov_md": false 00:15:03.208 }, 00:15:03.208 "memory_domains": [ 00:15:03.208 { 00:15:03.208 "dma_device_id": "system", 00:15:03.208 "dma_device_type": 1 00:15:03.208 }, 00:15:03.208 { 00:15:03.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.208 "dma_device_type": 2 00:15:03.208 }, 00:15:03.208 { 00:15:03.208 "dma_device_id": "system", 00:15:03.209 "dma_device_type": 1 00:15:03.209 }, 00:15:03.209 { 00:15:03.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.209 "dma_device_type": 2 00:15:03.209 }, 00:15:03.209 { 00:15:03.209 "dma_device_id": "system", 00:15:03.209 "dma_device_type": 1 00:15:03.209 }, 00:15:03.209 { 00:15:03.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.209 "dma_device_type": 2 00:15:03.209 } 00:15:03.209 ], 00:15:03.209 "driver_specific": { 00:15:03.209 "raid": { 00:15:03.209 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:03.209 "strip_size_kb": 64, 00:15:03.209 "state": "online", 00:15:03.209 "raid_level": "concat", 00:15:03.209 "superblock": true, 00:15:03.209 "num_base_bdevs": 3, 00:15:03.209 "num_base_bdevs_discovered": 3, 00:15:03.209 "num_base_bdevs_operational": 3, 00:15:03.209 "base_bdevs_list": [ 00:15:03.209 { 00:15:03.209 "name": "pt1", 00:15:03.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.209 "is_configured": true, 00:15:03.209 "data_offset": 2048, 00:15:03.209 "data_size": 63488 00:15:03.209 }, 00:15:03.209 { 00:15:03.209 "name": "pt2", 00:15:03.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.209 "is_configured": true, 00:15:03.209 "data_offset": 2048, 00:15:03.209 "data_size": 63488 00:15:03.209 }, 00:15:03.209 { 00:15:03.209 "name": "pt3", 00:15:03.209 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.209 "is_configured": true, 00:15:03.209 "data_offset": 2048, 00:15:03.209 "data_size": 63488 00:15:03.209 } 00:15:03.209 ] 00:15:03.209 } 00:15:03.209 } 00:15:03.209 }' 
00:15:03.209 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:03.209 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:03.209 pt2 00:15:03.209 pt3' 00:15:03.209 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.209 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:03.209 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.468 "name": "pt1", 00:15:03.468 "aliases": [ 00:15:03.468 "00000000-0000-0000-0000-000000000001" 00:15:03.468 ], 00:15:03.468 "product_name": "passthru", 00:15:03.468 "block_size": 512, 00:15:03.468 "num_blocks": 65536, 00:15:03.468 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.468 "assigned_rate_limits": { 00:15:03.468 "rw_ios_per_sec": 0, 00:15:03.468 "rw_mbytes_per_sec": 0, 00:15:03.468 "r_mbytes_per_sec": 0, 00:15:03.468 "w_mbytes_per_sec": 0 00:15:03.468 }, 00:15:03.468 "claimed": true, 00:15:03.468 "claim_type": "exclusive_write", 00:15:03.468 "zoned": false, 00:15:03.468 "supported_io_types": { 00:15:03.468 "read": true, 00:15:03.468 "write": true, 00:15:03.468 "unmap": true, 00:15:03.468 "flush": true, 00:15:03.468 "reset": true, 00:15:03.468 "nvme_admin": false, 00:15:03.468 "nvme_io": false, 00:15:03.468 "nvme_io_md": false, 00:15:03.468 "write_zeroes": true, 00:15:03.468 "zcopy": true, 00:15:03.468 "get_zone_info": false, 00:15:03.468 "zone_management": false, 00:15:03.468 "zone_append": false, 00:15:03.468 "compare": false, 00:15:03.468 "compare_and_write": false, 00:15:03.468 "abort": true, 00:15:03.468 "seek_hole": false, 00:15:03.468 
"seek_data": false, 00:15:03.468 "copy": true, 00:15:03.468 "nvme_iov_md": false 00:15:03.468 }, 00:15:03.468 "memory_domains": [ 00:15:03.468 { 00:15:03.468 "dma_device_id": "system", 00:15:03.468 "dma_device_type": 1 00:15:03.468 }, 00:15:03.468 { 00:15:03.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.468 "dma_device_type": 2 00:15:03.468 } 00:15:03.468 ], 00:15:03.468 "driver_specific": { 00:15:03.468 "passthru": { 00:15:03.468 "name": "pt1", 00:15:03.468 "base_bdev_name": "malloc1" 00:15:03.468 } 00:15:03.468 } 00:15:03.468 }' 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.468 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.728 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.728 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.728 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.728 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:03.728 13:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.997 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.997 "name": "pt2", 00:15:03.997 "aliases": [ 00:15:03.997 "00000000-0000-0000-0000-000000000002" 00:15:03.997 ], 00:15:03.997 "product_name": "passthru", 00:15:03.997 "block_size": 512, 00:15:03.997 "num_blocks": 65536, 00:15:03.997 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.997 "assigned_rate_limits": { 00:15:03.997 "rw_ios_per_sec": 0, 00:15:03.997 "rw_mbytes_per_sec": 0, 00:15:03.997 "r_mbytes_per_sec": 0, 00:15:03.997 "w_mbytes_per_sec": 0 00:15:03.997 }, 00:15:03.997 "claimed": true, 00:15:03.997 "claim_type": "exclusive_write", 00:15:03.997 "zoned": false, 00:15:03.997 "supported_io_types": { 00:15:03.997 "read": true, 00:15:03.997 "write": true, 00:15:03.997 "unmap": true, 00:15:03.997 "flush": true, 00:15:03.997 "reset": true, 00:15:03.997 "nvme_admin": false, 00:15:03.997 "nvme_io": false, 00:15:03.997 "nvme_io_md": false, 00:15:03.997 "write_zeroes": true, 00:15:03.997 "zcopy": true, 00:15:03.997 "get_zone_info": false, 00:15:03.998 "zone_management": false, 00:15:03.998 "zone_append": false, 00:15:03.998 "compare": false, 00:15:03.998 "compare_and_write": false, 00:15:03.998 "abort": true, 00:15:03.998 "seek_hole": false, 00:15:03.998 "seek_data": false, 00:15:03.998 "copy": true, 00:15:03.998 "nvme_iov_md": false 00:15:03.998 }, 00:15:03.998 "memory_domains": [ 00:15:03.998 { 00:15:03.998 "dma_device_id": "system", 00:15:03.998 "dma_device_type": 1 00:15:03.998 }, 00:15:03.998 { 00:15:03.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.998 "dma_device_type": 2 00:15:03.998 } 00:15:03.998 ], 00:15:03.998 "driver_specific": { 00:15:03.998 "passthru": { 00:15:03.998 "name": "pt2", 00:15:03.998 "base_bdev_name": "malloc2" 00:15:03.998 } 00:15:03.998 } 00:15:03.998 }' 00:15:03.998 13:33:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.998 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.256 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.256 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.256 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.257 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.257 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.257 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.257 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:04.257 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.515 "name": "pt3", 00:15:04.515 "aliases": [ 00:15:04.515 "00000000-0000-0000-0000-000000000003" 00:15:04.515 ], 00:15:04.515 "product_name": "passthru", 00:15:04.515 "block_size": 512, 00:15:04.515 "num_blocks": 65536, 00:15:04.515 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:04.515 "assigned_rate_limits": { 
00:15:04.515 "rw_ios_per_sec": 0, 00:15:04.515 "rw_mbytes_per_sec": 0, 00:15:04.515 "r_mbytes_per_sec": 0, 00:15:04.515 "w_mbytes_per_sec": 0 00:15:04.515 }, 00:15:04.515 "claimed": true, 00:15:04.515 "claim_type": "exclusive_write", 00:15:04.515 "zoned": false, 00:15:04.515 "supported_io_types": { 00:15:04.515 "read": true, 00:15:04.515 "write": true, 00:15:04.515 "unmap": true, 00:15:04.515 "flush": true, 00:15:04.515 "reset": true, 00:15:04.515 "nvme_admin": false, 00:15:04.515 "nvme_io": false, 00:15:04.515 "nvme_io_md": false, 00:15:04.515 "write_zeroes": true, 00:15:04.515 "zcopy": true, 00:15:04.515 "get_zone_info": false, 00:15:04.515 "zone_management": false, 00:15:04.515 "zone_append": false, 00:15:04.515 "compare": false, 00:15:04.515 "compare_and_write": false, 00:15:04.515 "abort": true, 00:15:04.515 "seek_hole": false, 00:15:04.515 "seek_data": false, 00:15:04.515 "copy": true, 00:15:04.515 "nvme_iov_md": false 00:15:04.515 }, 00:15:04.515 "memory_domains": [ 00:15:04.515 { 00:15:04.515 "dma_device_id": "system", 00:15:04.515 "dma_device_type": 1 00:15:04.515 }, 00:15:04.515 { 00:15:04.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.515 "dma_device_type": 2 00:15:04.515 } 00:15:04.515 ], 00:15:04.515 "driver_specific": { 00:15:04.515 "passthru": { 00:15:04.515 "name": "pt3", 00:15:04.515 "base_bdev_name": "malloc3" 00:15:04.515 } 00:15:04.515 } 00:15:04.515 }' 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:04.515 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.774 13:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:04.774 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:05.033 [2024-07-15 13:33:44.311714] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:05.033 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bd4952a8-77c7-4ba4-8c80-b0cebc7fff28 00:15:05.033 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z bd4952a8-77c7-4ba4-8c80-b0cebc7fff28 ']' 00:15:05.033 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:05.292 [2024-07-15 13:33:44.479884] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:05.292 [2024-07-15 13:33:44.479904] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:05.292 [2024-07-15 13:33:44.479960] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:05.292 [2024-07-15 13:33:44.480015] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:15:05.292 [2024-07-15 13:33:44.480027] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2594ea0 name raid_bdev1, state offline 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:05.292 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:05.551 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:05.551 13:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:05.810 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:05.810 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:06.069 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:06.069 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:06.637 13:33:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:06.637 [2024-07-15 13:33:46.023900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:06.637 [2024-07-15 13:33:46.025280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:06.637 [2024-07-15 13:33:46.025323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:06.637 [2024-07-15 13:33:46.025367] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:06.637 [2024-07-15 13:33:46.025406] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:06.637 [2024-07-15 13:33:46.025430] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:06.637 [2024-07-15 13:33:46.025449] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:06.637 [2024-07-15 13:33:46.025459] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x273fff0 name raid_bdev1, state configuring 00:15:06.637 request: 00:15:06.637 { 00:15:06.637 "name": "raid_bdev1", 00:15:06.637 "raid_level": "concat", 00:15:06.637 "base_bdevs": [ 00:15:06.637 "malloc1", 00:15:06.637 "malloc2", 00:15:06.637 "malloc3" 00:15:06.637 ], 00:15:06.637 "strip_size_kb": 64, 00:15:06.637 "superblock": false, 00:15:06.637 "method": "bdev_raid_create", 00:15:06.637 "req_id": 1 00:15:06.637 } 00:15:06.637 Got JSON-RPC error response 00:15:06.637 response: 00:15:06.637 { 00:15:06.637 "code": -17, 00:15:06.637 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:06.637 } 00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:06.637 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.897 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:06.897 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:06.897 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:07.156 [2024-07-15 13:33:46.501117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:07.156 [2024-07-15 13:33:46.501165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.156 [2024-07-15 13:33:46.501186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259c7a0 00:15:07.156 [2024-07-15 13:33:46.501199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.156 [2024-07-15 13:33:46.502827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.156 [2024-07-15 13:33:46.502855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:07.156 [2024-07-15 13:33:46.502919] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:07.156 [2024-07-15 13:33:46.502956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:07.156 pt1 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.156 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.414 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.414 "name": "raid_bdev1", 00:15:07.414 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:07.414 "strip_size_kb": 64, 00:15:07.414 "state": "configuring", 00:15:07.414 "raid_level": "concat", 00:15:07.414 "superblock": true, 00:15:07.414 "num_base_bdevs": 3, 00:15:07.414 "num_base_bdevs_discovered": 1, 00:15:07.414 "num_base_bdevs_operational": 3, 00:15:07.414 "base_bdevs_list": [ 00:15:07.414 { 00:15:07.414 "name": "pt1", 00:15:07.414 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:07.414 
"is_configured": true, 00:15:07.414 "data_offset": 2048, 00:15:07.414 "data_size": 63488 00:15:07.414 }, 00:15:07.414 { 00:15:07.414 "name": null, 00:15:07.414 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:07.414 "is_configured": false, 00:15:07.414 "data_offset": 2048, 00:15:07.414 "data_size": 63488 00:15:07.414 }, 00:15:07.414 { 00:15:07.414 "name": null, 00:15:07.414 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:07.414 "is_configured": false, 00:15:07.414 "data_offset": 2048, 00:15:07.414 "data_size": 63488 00:15:07.414 } 00:15:07.414 ] 00:15:07.414 }' 00:15:07.414 13:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.414 13:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.347 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:08.347 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:08.640 [2024-07-15 13:33:47.860727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:08.640 [2024-07-15 13:33:47.860786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.640 [2024-07-15 13:33:47.860807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2593c70 00:15:08.640 [2024-07-15 13:33:47.860820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.640 [2024-07-15 13:33:47.861178] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.640 [2024-07-15 13:33:47.861198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:08.640 [2024-07-15 13:33:47.861263] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:08.640 [2024-07-15 
13:33:47.861282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:08.640 pt2 00:15:08.640 13:33:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:08.899 [2024-07-15 13:33:48.109396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.899 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:09.158 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.158 "name": "raid_bdev1", 00:15:09.158 
"uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:09.158 "strip_size_kb": 64, 00:15:09.158 "state": "configuring", 00:15:09.158 "raid_level": "concat", 00:15:09.158 "superblock": true, 00:15:09.158 "num_base_bdevs": 3, 00:15:09.158 "num_base_bdevs_discovered": 1, 00:15:09.158 "num_base_bdevs_operational": 3, 00:15:09.158 "base_bdevs_list": [ 00:15:09.158 { 00:15:09.158 "name": "pt1", 00:15:09.158 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:09.158 "is_configured": true, 00:15:09.158 "data_offset": 2048, 00:15:09.158 "data_size": 63488 00:15:09.158 }, 00:15:09.158 { 00:15:09.158 "name": null, 00:15:09.158 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:09.158 "is_configured": false, 00:15:09.158 "data_offset": 2048, 00:15:09.158 "data_size": 63488 00:15:09.158 }, 00:15:09.158 { 00:15:09.158 "name": null, 00:15:09.158 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:09.158 "is_configured": false, 00:15:09.158 "data_offset": 2048, 00:15:09.158 "data_size": 63488 00:15:09.158 } 00:15:09.158 ] 00:15:09.158 }' 00:15:09.158 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.158 13:33:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.725 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:09.725 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:09.725 13:33:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:09.983 [2024-07-15 13:33:49.196286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:09.983 [2024-07-15 13:33:49.196335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.983 [2024-07-15 13:33:49.196357] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x259ca10 00:15:09.983 [2024-07-15 13:33:49.196376] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.983 [2024-07-15 13:33:49.196703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:09.983 [2024-07-15 13:33:49.196723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:09.983 [2024-07-15 13:33:49.196781] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:09.983 [2024-07-15 13:33:49.196801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:09.983 pt2 00:15:09.983 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:09.983 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:09.983 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:10.241 [2024-07-15 13:33:49.444952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:10.241 [2024-07-15 13:33:49.444983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:10.241 [2024-07-15 13:33:49.444998] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2736740 00:15:10.241 [2024-07-15 13:33:49.445011] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:10.241 [2024-07-15 13:33:49.445273] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:10.241 [2024-07-15 13:33:49.445291] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:10.241 [2024-07-15 13:33:49.445337] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:10.241 
[2024-07-15 13:33:49.445353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:10.241 [2024-07-15 13:33:49.445453] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2736c00 00:15:10.241 [2024-07-15 13:33:49.445464] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:10.241 [2024-07-15 13:33:49.445626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259ba40 00:15:10.241 [2024-07-15 13:33:49.445749] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2736c00 00:15:10.241 [2024-07-15 13:33:49.445759] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2736c00 00:15:10.241 [2024-07-15 13:33:49.445849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:10.241 pt3 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.241 
13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.241 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:10.499 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.499 "name": "raid_bdev1", 00:15:10.499 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:10.499 "strip_size_kb": 64, 00:15:10.499 "state": "online", 00:15:10.499 "raid_level": "concat", 00:15:10.499 "superblock": true, 00:15:10.499 "num_base_bdevs": 3, 00:15:10.499 "num_base_bdevs_discovered": 3, 00:15:10.499 "num_base_bdevs_operational": 3, 00:15:10.499 "base_bdevs_list": [ 00:15:10.499 { 00:15:10.499 "name": "pt1", 00:15:10.499 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:10.499 "is_configured": true, 00:15:10.499 "data_offset": 2048, 00:15:10.499 "data_size": 63488 00:15:10.499 }, 00:15:10.499 { 00:15:10.499 "name": "pt2", 00:15:10.499 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:10.499 "is_configured": true, 00:15:10.499 "data_offset": 2048, 00:15:10.499 "data_size": 63488 00:15:10.499 }, 00:15:10.499 { 00:15:10.499 "name": "pt3", 00:15:10.499 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:10.499 "is_configured": true, 00:15:10.499 "data_offset": 2048, 00:15:10.499 "data_size": 63488 00:15:10.499 } 00:15:10.499 ] 00:15:10.499 }' 00:15:10.499 13:33:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.499 13:33:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:11.063 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:11.321 [2024-07-15 13:33:50.532122] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:11.321 "name": "raid_bdev1", 00:15:11.321 "aliases": [ 00:15:11.321 "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28" 00:15:11.321 ], 00:15:11.321 "product_name": "Raid Volume", 00:15:11.321 "block_size": 512, 00:15:11.321 "num_blocks": 190464, 00:15:11.321 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:11.321 "assigned_rate_limits": { 00:15:11.321 "rw_ios_per_sec": 0, 00:15:11.321 "rw_mbytes_per_sec": 0, 00:15:11.321 "r_mbytes_per_sec": 0, 00:15:11.321 "w_mbytes_per_sec": 0 00:15:11.321 }, 00:15:11.321 "claimed": false, 00:15:11.321 "zoned": false, 00:15:11.321 "supported_io_types": { 00:15:11.321 "read": true, 00:15:11.321 "write": true, 00:15:11.321 "unmap": true, 00:15:11.321 "flush": true, 00:15:11.321 "reset": true, 00:15:11.321 "nvme_admin": false, 00:15:11.321 "nvme_io": false, 00:15:11.321 "nvme_io_md": false, 00:15:11.321 "write_zeroes": true, 00:15:11.321 "zcopy": false, 00:15:11.321 
"get_zone_info": false, 00:15:11.321 "zone_management": false, 00:15:11.321 "zone_append": false, 00:15:11.321 "compare": false, 00:15:11.321 "compare_and_write": false, 00:15:11.321 "abort": false, 00:15:11.321 "seek_hole": false, 00:15:11.321 "seek_data": false, 00:15:11.321 "copy": false, 00:15:11.321 "nvme_iov_md": false 00:15:11.321 }, 00:15:11.321 "memory_domains": [ 00:15:11.321 { 00:15:11.321 "dma_device_id": "system", 00:15:11.321 "dma_device_type": 1 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.321 "dma_device_type": 2 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "dma_device_id": "system", 00:15:11.321 "dma_device_type": 1 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.321 "dma_device_type": 2 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "dma_device_id": "system", 00:15:11.321 "dma_device_type": 1 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.321 "dma_device_type": 2 00:15:11.321 } 00:15:11.321 ], 00:15:11.321 "driver_specific": { 00:15:11.321 "raid": { 00:15:11.321 "uuid": "bd4952a8-77c7-4ba4-8c80-b0cebc7fff28", 00:15:11.321 "strip_size_kb": 64, 00:15:11.321 "state": "online", 00:15:11.321 "raid_level": "concat", 00:15:11.321 "superblock": true, 00:15:11.321 "num_base_bdevs": 3, 00:15:11.321 "num_base_bdevs_discovered": 3, 00:15:11.321 "num_base_bdevs_operational": 3, 00:15:11.321 "base_bdevs_list": [ 00:15:11.321 { 00:15:11.321 "name": "pt1", 00:15:11.321 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:11.321 "is_configured": true, 00:15:11.321 "data_offset": 2048, 00:15:11.321 "data_size": 63488 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "name": "pt2", 00:15:11.321 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:11.321 "is_configured": true, 00:15:11.321 "data_offset": 2048, 00:15:11.321 "data_size": 63488 00:15:11.321 }, 00:15:11.321 { 00:15:11.321 "name": "pt3", 00:15:11.321 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:11.321 "is_configured": true, 00:15:11.321 "data_offset": 2048, 00:15:11.321 "data_size": 63488 00:15:11.321 } 00:15:11.321 ] 00:15:11.321 } 00:15:11.321 } 00:15:11.321 }' 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:11.321 pt2 00:15:11.321 pt3' 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:11.321 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:11.579 "name": "pt1", 00:15:11.579 "aliases": [ 00:15:11.579 "00000000-0000-0000-0000-000000000001" 00:15:11.579 ], 00:15:11.579 "product_name": "passthru", 00:15:11.579 "block_size": 512, 00:15:11.579 "num_blocks": 65536, 00:15:11.579 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:11.579 "assigned_rate_limits": { 00:15:11.579 "rw_ios_per_sec": 0, 00:15:11.579 "rw_mbytes_per_sec": 0, 00:15:11.579 "r_mbytes_per_sec": 0, 00:15:11.579 "w_mbytes_per_sec": 0 00:15:11.579 }, 00:15:11.579 "claimed": true, 00:15:11.579 "claim_type": "exclusive_write", 00:15:11.579 "zoned": false, 00:15:11.579 "supported_io_types": { 00:15:11.579 "read": true, 00:15:11.579 "write": true, 00:15:11.579 "unmap": true, 00:15:11.579 "flush": true, 00:15:11.579 "reset": true, 00:15:11.579 "nvme_admin": false, 00:15:11.579 "nvme_io": false, 00:15:11.579 "nvme_io_md": false, 00:15:11.579 "write_zeroes": true, 00:15:11.579 "zcopy": true, 00:15:11.579 "get_zone_info": false, 
00:15:11.579 "zone_management": false, 00:15:11.579 "zone_append": false, 00:15:11.579 "compare": false, 00:15:11.579 "compare_and_write": false, 00:15:11.579 "abort": true, 00:15:11.579 "seek_hole": false, 00:15:11.579 "seek_data": false, 00:15:11.579 "copy": true, 00:15:11.579 "nvme_iov_md": false 00:15:11.579 }, 00:15:11.579 "memory_domains": [ 00:15:11.579 { 00:15:11.579 "dma_device_id": "system", 00:15:11.579 "dma_device_type": 1 00:15:11.579 }, 00:15:11.579 { 00:15:11.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.579 "dma_device_type": 2 00:15:11.579 } 00:15:11.579 ], 00:15:11.579 "driver_specific": { 00:15:11.579 "passthru": { 00:15:11.579 "name": "pt1", 00:15:11.579 "base_bdev_name": "malloc1" 00:15:11.579 } 00:15:11.579 } 00:15:11.579 }' 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.579 13:33:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:11.836 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:11.836 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:11.837 13:33:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:11.837 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:12.094 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.094 "name": "pt2", 00:15:12.094 "aliases": [ 00:15:12.094 "00000000-0000-0000-0000-000000000002" 00:15:12.094 ], 00:15:12.094 "product_name": "passthru", 00:15:12.094 "block_size": 512, 00:15:12.094 "num_blocks": 65536, 00:15:12.094 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:12.094 "assigned_rate_limits": { 00:15:12.094 "rw_ios_per_sec": 0, 00:15:12.094 "rw_mbytes_per_sec": 0, 00:15:12.094 "r_mbytes_per_sec": 0, 00:15:12.094 "w_mbytes_per_sec": 0 00:15:12.094 }, 00:15:12.094 "claimed": true, 00:15:12.094 "claim_type": "exclusive_write", 00:15:12.094 "zoned": false, 00:15:12.094 "supported_io_types": { 00:15:12.094 "read": true, 00:15:12.094 "write": true, 00:15:12.094 "unmap": true, 00:15:12.094 "flush": true, 00:15:12.094 "reset": true, 00:15:12.094 "nvme_admin": false, 00:15:12.094 "nvme_io": false, 00:15:12.094 "nvme_io_md": false, 00:15:12.094 "write_zeroes": true, 00:15:12.094 "zcopy": true, 00:15:12.094 "get_zone_info": false, 00:15:12.094 "zone_management": false, 00:15:12.094 "zone_append": false, 00:15:12.094 "compare": false, 00:15:12.094 "compare_and_write": false, 00:15:12.094 "abort": true, 00:15:12.094 "seek_hole": false, 00:15:12.094 "seek_data": false, 00:15:12.094 "copy": true, 00:15:12.095 "nvme_iov_md": false 00:15:12.095 }, 00:15:12.095 "memory_domains": [ 00:15:12.095 { 00:15:12.095 "dma_device_id": "system", 00:15:12.095 "dma_device_type": 1 00:15:12.095 }, 00:15:12.095 { 00:15:12.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.095 
"dma_device_type": 2 00:15:12.095 } 00:15:12.095 ], 00:15:12.095 "driver_specific": { 00:15:12.095 "passthru": { 00:15:12.095 "name": "pt2", 00:15:12.095 "base_bdev_name": "malloc2" 00:15:12.095 } 00:15:12.095 } 00:15:12.095 }' 00:15:12.095 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.095 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.095 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.095 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:12.352 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:12.610 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:12.610 "name": "pt3", 00:15:12.610 "aliases": [ 00:15:12.610 
"00000000-0000-0000-0000-000000000003" 00:15:12.610 ], 00:15:12.610 "product_name": "passthru", 00:15:12.610 "block_size": 512, 00:15:12.610 "num_blocks": 65536, 00:15:12.610 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:12.610 "assigned_rate_limits": { 00:15:12.610 "rw_ios_per_sec": 0, 00:15:12.610 "rw_mbytes_per_sec": 0, 00:15:12.610 "r_mbytes_per_sec": 0, 00:15:12.610 "w_mbytes_per_sec": 0 00:15:12.610 }, 00:15:12.610 "claimed": true, 00:15:12.610 "claim_type": "exclusive_write", 00:15:12.610 "zoned": false, 00:15:12.610 "supported_io_types": { 00:15:12.610 "read": true, 00:15:12.610 "write": true, 00:15:12.610 "unmap": true, 00:15:12.610 "flush": true, 00:15:12.610 "reset": true, 00:15:12.610 "nvme_admin": false, 00:15:12.610 "nvme_io": false, 00:15:12.610 "nvme_io_md": false, 00:15:12.610 "write_zeroes": true, 00:15:12.610 "zcopy": true, 00:15:12.610 "get_zone_info": false, 00:15:12.610 "zone_management": false, 00:15:12.610 "zone_append": false, 00:15:12.610 "compare": false, 00:15:12.610 "compare_and_write": false, 00:15:12.610 "abort": true, 00:15:12.610 "seek_hole": false, 00:15:12.610 "seek_data": false, 00:15:12.610 "copy": true, 00:15:12.610 "nvme_iov_md": false 00:15:12.610 }, 00:15:12.610 "memory_domains": [ 00:15:12.610 { 00:15:12.610 "dma_device_id": "system", 00:15:12.610 "dma_device_type": 1 00:15:12.610 }, 00:15:12.610 { 00:15:12.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.610 "dma_device_type": 2 00:15:12.610 } 00:15:12.610 ], 00:15:12.610 "driver_specific": { 00:15:12.610 "passthru": { 00:15:12.610 "name": "pt3", 00:15:12.610 "base_bdev_name": "malloc3" 00:15:12.610 } 00:15:12.610 } 00:15:12.610 }' 00:15:12.610 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.610 13:33:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:12.867 13:33:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:12.867 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:13.125 [2024-07-15 13:33:52.525382] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' bd4952a8-77c7-4ba4-8c80-b0cebc7fff28 '!=' bd4952a8-77c7-4ba4-8c80-b0cebc7fff28 ']' 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2109474 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2109474 ']' 00:15:13.125 13:33:52 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2109474 00:15:13.125 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2109474 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2109474' 00:15:13.383 killing process with pid 2109474 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2109474 00:15:13.383 [2024-07-15 13:33:52.593315] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:13.383 [2024-07-15 13:33:52.593372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:13.383 [2024-07-15 13:33:52.593433] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:13.383 [2024-07-15 13:33:52.593449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2736c00 name raid_bdev1, state offline 00:15:13.383 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2109474 00:15:13.383 [2024-07-15 13:33:52.623389] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:13.642 13:33:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:13.642 00:15:13.642 real 0m14.194s 00:15:13.642 user 0m25.546s 00:15:13.642 sys 0m2.525s 00:15:13.642 13:33:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:13.642 13:33:52 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.642 ************************************ 00:15:13.642 END TEST raid_superblock_test 00:15:13.642 ************************************ 00:15:13.642 13:33:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:13.642 13:33:52 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:13.642 13:33:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:13.642 13:33:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:13.642 13:33:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:13.642 ************************************ 00:15:13.642 START TEST raid_read_error_test 00:15:13.642 ************************************ 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.642 13:33:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Ywu4E7CgCh 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2111529 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2111529 /var/tmp/spdk-raid.sock 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2111529 ']' 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:13.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:13.642 13:33:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.642 [2024-07-15 13:33:53.003004] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:15:13.642 [2024-07-15 13:33:53.003074] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2111529 ] 00:15:13.901 [2024-07-15 13:33:53.133369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:13.901 [2024-07-15 13:33:53.236009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:13.901 [2024-07-15 13:33:53.302775] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:13.901 [2024-07-15 13:33:53.302809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:14.833 13:33:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:14.833 13:33:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:14.833 13:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:14.833 13:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:14.833 BaseBdev1_malloc 00:15:14.833 13:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:15.091 true 00:15:15.091 13:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:15.349 [2024-07-15 13:33:54.651116] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:15.349 [2024-07-15 13:33:54.651162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:15.349 [2024-07-15 13:33:54.651185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12a40d0 00:15:15.349 [2024-07-15 13:33:54.651199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.349 [2024-07-15 13:33:54.653075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.349 [2024-07-15 13:33:54.653108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:15.349 BaseBdev1 00:15:15.349 13:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:15.349 13:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:15.607 BaseBdev2_malloc 00:15:15.607 13:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:15.865 true 00:15:15.865 13:33:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:16.123 [2024-07-15 13:33:55.385614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:16.123 [2024-07-15 13:33:55.385660] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:16.123 [2024-07-15 13:33:55.385681] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12a8910 00:15:16.123 [2024-07-15 13:33:55.385695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:16.123 [2024-07-15 13:33:55.387294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:16.123 [2024-07-15 13:33:55.387323] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:16.123 BaseBdev2 00:15:16.123 13:33:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:16.123 13:33:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:16.382 BaseBdev3_malloc 00:15:16.382 13:33:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:16.639 true 00:15:16.639 13:33:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:16.897 [2024-07-15 13:33:56.125365] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:16.897 [2024-07-15 13:33:56.125413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:16.897 [2024-07-15 13:33:56.125438] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12aabd0 00:15:16.897 [2024-07-15 13:33:56.125452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:16.897 [2024-07-15 13:33:56.127064] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:16.897 [2024-07-15 13:33:56.127095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:16.897 BaseBdev3 00:15:16.897 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:17.155 [2024-07-15 13:33:56.358009] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:17.155 [2024-07-15 13:33:56.359313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:17.155 [2024-07-15 13:33:56.359383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:17.155 [2024-07-15 13:33:56.359596] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ac280 00:15:17.155 [2024-07-15 13:33:56.359607] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:17.155 [2024-07-15 13:33:56.359800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12abe20 00:15:17.155 [2024-07-15 13:33:56.359956] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ac280 00:15:17.155 [2024-07-15 13:33:56.359968] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ac280 00:15:17.155 [2024-07-15 13:33:56.360073] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.155 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:17.413 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.413 "name": "raid_bdev1", 00:15:17.413 "uuid": "d43a31f7-4918-4af7-99a3-506529a55e1b", 00:15:17.413 "strip_size_kb": 64, 00:15:17.413 "state": "online", 00:15:17.413 "raid_level": "concat", 00:15:17.413 "superblock": true, 00:15:17.413 "num_base_bdevs": 3, 00:15:17.413 "num_base_bdevs_discovered": 3, 00:15:17.413 "num_base_bdevs_operational": 3, 00:15:17.413 "base_bdevs_list": [ 00:15:17.413 { 00:15:17.413 "name": "BaseBdev1", 00:15:17.413 "uuid": "f4525c81-036e-5c80-afc5-35427832d5f4", 00:15:17.413 "is_configured": true, 00:15:17.413 "data_offset": 2048, 00:15:17.413 "data_size": 63488 00:15:17.413 }, 00:15:17.413 { 00:15:17.413 "name": "BaseBdev2", 00:15:17.413 "uuid": "952adfd0-cf3c-5388-87a1-007e52f196e7", 00:15:17.413 "is_configured": true, 00:15:17.413 "data_offset": 2048, 00:15:17.413 "data_size": 63488 00:15:17.413 }, 00:15:17.413 { 00:15:17.413 "name": "BaseBdev3", 00:15:17.413 "uuid": "dd228afe-036d-5a31-98d0-eaee14b5d500", 00:15:17.413 "is_configured": true, 00:15:17.413 "data_offset": 2048, 00:15:17.413 "data_size": 63488 00:15:17.413 } 00:15:17.413 ] 00:15:17.413 }' 00:15:17.413 13:33:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.413 13:33:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.979 13:33:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:15:17.979 13:33:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:17.979 [2024-07-15 13:33:57.332870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fa4d0 00:15:18.914 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.172 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:19.431 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.431 "name": "raid_bdev1", 00:15:19.431 "uuid": "d43a31f7-4918-4af7-99a3-506529a55e1b", 00:15:19.431 "strip_size_kb": 64, 00:15:19.431 "state": "online", 00:15:19.431 "raid_level": "concat", 00:15:19.431 "superblock": true, 00:15:19.431 "num_base_bdevs": 3, 00:15:19.431 "num_base_bdevs_discovered": 3, 00:15:19.431 "num_base_bdevs_operational": 3, 00:15:19.431 "base_bdevs_list": [ 00:15:19.431 { 00:15:19.431 "name": "BaseBdev1", 00:15:19.431 "uuid": "f4525c81-036e-5c80-afc5-35427832d5f4", 00:15:19.431 "is_configured": true, 00:15:19.431 "data_offset": 2048, 00:15:19.431 "data_size": 63488 00:15:19.431 }, 00:15:19.431 { 00:15:19.431 "name": "BaseBdev2", 00:15:19.431 "uuid": "952adfd0-cf3c-5388-87a1-007e52f196e7", 00:15:19.431 "is_configured": true, 00:15:19.431 "data_offset": 2048, 00:15:19.431 "data_size": 63488 00:15:19.431 }, 00:15:19.431 { 00:15:19.431 "name": "BaseBdev3", 00:15:19.431 "uuid": "dd228afe-036d-5a31-98d0-eaee14b5d500", 00:15:19.431 "is_configured": true, 00:15:19.431 "data_offset": 2048, 00:15:19.431 "data_size": 63488 00:15:19.431 } 00:15:19.431 ] 00:15:19.431 }' 00:15:19.431 13:33:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.431 13:33:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.997 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:20.255 [2024-07-15 
13:33:59.562228] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:20.255 [2024-07-15 13:33:59.562266] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:20.255 [2024-07-15 13:33:59.565433] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:20.255 [2024-07-15 13:33:59.565470] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.255 [2024-07-15 13:33:59.565505] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:20.255 [2024-07-15 13:33:59.565516] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ac280 name raid_bdev1, state offline 00:15:20.255 0 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2111529 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2111529 ']' 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2111529 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2111529 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2111529' 00:15:20.255 killing process with pid 2111529 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2111529 00:15:20.255 [2024-07-15 13:33:59.628806] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:20.255 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2111529 00:15:20.255 [2024-07-15 13:33:59.649387] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Ywu4E7CgCh 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:15:20.513 00:15:20.513 real 0m6.961s 00:15:20.513 user 0m10.954s 00:15:20.513 sys 0m1.281s 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:20.513 13:33:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.513 ************************************ 00:15:20.513 END TEST raid_read_error_test 00:15:20.513 ************************************ 00:15:20.513 13:33:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:20.513 13:33:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:20.513 13:33:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:20.513 13:33:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:20.514 13:33:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:20.772 
************************************ 00:15:20.772 START TEST raid_write_error_test 00:15:20.772 ************************************ 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VeaIniQ5HH 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2112595 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2112595 /var/tmp/spdk-raid.sock 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2112595 ']' 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:20.772 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:20.772 13:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.772 [2024-07-15 13:34:00.054862] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:15:20.772 [2024-07-15 13:34:00.054946] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2112595 ] 00:15:20.772 [2024-07-15 13:34:00.185808] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:21.031 [2024-07-15 13:34:00.286626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:21.031 [2024-07-15 13:34:00.350951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.031 [2024-07-15 13:34:00.350999] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:21.597 13:34:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:21.597 13:34:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:21.597 13:34:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:21.597 13:34:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:21.854 BaseBdev1_malloc 00:15:21.854 13:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:22.111 
true 00:15:22.111 13:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:22.368 [2024-07-15 13:34:01.711809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:22.368 [2024-07-15 13:34:01.711854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:22.368 [2024-07-15 13:34:01.711877] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9ab0d0 00:15:22.368 [2024-07-15 13:34:01.711890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:22.368 [2024-07-15 13:34:01.713644] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:22.368 [2024-07-15 13:34:01.713675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:22.368 BaseBdev1 00:15:22.368 13:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:22.368 13:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:22.626 BaseBdev2_malloc 00:15:22.626 13:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:22.890 true 00:15:22.890 13:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:23.146 [2024-07-15 13:34:02.442346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:23.146 [2024-07-15 13:34:02.442388] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:23.146 [2024-07-15 13:34:02.442410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9af910 00:15:23.146 [2024-07-15 13:34:02.442423] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:23.146 [2024-07-15 13:34:02.443851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:23.146 [2024-07-15 13:34:02.443879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:23.146 BaseBdev2 00:15:23.146 13:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:23.146 13:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:23.403 BaseBdev3_malloc 00:15:23.403 13:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:23.661 true 00:15:23.661 13:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:23.918 [2024-07-15 13:34:03.188984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:23.918 [2024-07-15 13:34:03.189030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:23.918 [2024-07-15 13:34:03.189053] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b1bd0 00:15:23.918 [2024-07-15 13:34:03.189066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:23.918 [2024-07-15 13:34:03.190524] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:15:23.918 [2024-07-15 13:34:03.190554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:23.918 BaseBdev3 00:15:23.918 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:24.175 [2024-07-15 13:34:03.433661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:24.175 [2024-07-15 13:34:03.434934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:24.175 [2024-07-15 13:34:03.435010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.175 [2024-07-15 13:34:03.435222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9b3280 00:15:24.175 [2024-07-15 13:34:03.435234] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:24.175 [2024-07-15 13:34:03.435429] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b2e20 00:15:24.175 [2024-07-15 13:34:03.435576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9b3280 00:15:24.175 [2024-07-15 13:34:03.435586] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9b3280 00:15:24.175 [2024-07-15 13:34:03.435684] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.175 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:24.432 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.432 "name": "raid_bdev1", 00:15:24.432 "uuid": "97ee85f0-6efa-4dee-be23-0a388af08627", 00:15:24.432 "strip_size_kb": 64, 00:15:24.432 "state": "online", 00:15:24.432 "raid_level": "concat", 00:15:24.432 "superblock": true, 00:15:24.432 "num_base_bdevs": 3, 00:15:24.432 "num_base_bdevs_discovered": 3, 00:15:24.432 "num_base_bdevs_operational": 3, 00:15:24.432 "base_bdevs_list": [ 00:15:24.432 { 00:15:24.432 "name": "BaseBdev1", 00:15:24.432 "uuid": "6eb5069f-71ba-5160-9a8b-75db148b22ad", 00:15:24.432 "is_configured": true, 00:15:24.432 "data_offset": 2048, 00:15:24.432 "data_size": 63488 00:15:24.432 }, 00:15:24.432 { 00:15:24.432 "name": "BaseBdev2", 00:15:24.432 "uuid": "1d8b5253-d781-56a0-aa92-766c9e90e8b8", 00:15:24.432 "is_configured": true, 00:15:24.432 "data_offset": 2048, 00:15:24.432 "data_size": 63488 00:15:24.432 }, 00:15:24.432 { 00:15:24.432 
"name": "BaseBdev3", 00:15:24.432 "uuid": "0adcc95f-bba7-54a0-8046-8f3b6edb0802", 00:15:24.432 "is_configured": true, 00:15:24.432 "data_offset": 2048, 00:15:24.432 "data_size": 63488 00:15:24.432 } 00:15:24.432 ] 00:15:24.432 }' 00:15:24.432 13:34:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.432 13:34:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.997 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:24.997 13:34:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:24.997 [2024-07-15 13:34:04.380506] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8014d0 00:15:25.930 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:26.189 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:26.189 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:26.189 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:26.189 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.190 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:26.448 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.448 "name": "raid_bdev1", 00:15:26.448 "uuid": "97ee85f0-6efa-4dee-be23-0a388af08627", 00:15:26.448 "strip_size_kb": 64, 00:15:26.448 "state": "online", 00:15:26.448 "raid_level": "concat", 00:15:26.448 "superblock": true, 00:15:26.448 "num_base_bdevs": 3, 00:15:26.448 "num_base_bdevs_discovered": 3, 00:15:26.448 "num_base_bdevs_operational": 3, 00:15:26.448 "base_bdevs_list": [ 00:15:26.448 { 00:15:26.448 "name": "BaseBdev1", 00:15:26.448 "uuid": "6eb5069f-71ba-5160-9a8b-75db148b22ad", 00:15:26.448 "is_configured": true, 00:15:26.448 "data_offset": 2048, 00:15:26.448 "data_size": 63488 00:15:26.448 }, 00:15:26.448 { 00:15:26.448 "name": "BaseBdev2", 00:15:26.448 "uuid": "1d8b5253-d781-56a0-aa92-766c9e90e8b8", 00:15:26.448 "is_configured": true, 00:15:26.448 "data_offset": 2048, 00:15:26.448 "data_size": 63488 00:15:26.448 }, 00:15:26.448 { 00:15:26.448 "name": "BaseBdev3", 00:15:26.448 "uuid": "0adcc95f-bba7-54a0-8046-8f3b6edb0802", 00:15:26.448 "is_configured": true, 00:15:26.448 "data_offset": 2048, 
00:15:26.448 "data_size": 63488 00:15:26.448 } 00:15:26.448 ] 00:15:26.448 }' 00:15:26.448 13:34:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.448 13:34:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.015 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:27.274 [2024-07-15 13:34:06.614073] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:27.274 [2024-07-15 13:34:06.614114] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:27.274 [2024-07-15 13:34:06.617340] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:27.274 [2024-07-15 13:34:06.617379] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:27.274 [2024-07-15 13:34:06.617415] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:27.274 [2024-07-15 13:34:06.617427] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b3280 name raid_bdev1, state offline 00:15:27.274 0 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2112595 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2112595 ']' 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2112595 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2112595 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2112595' 00:15:27.274 killing process with pid 2112595 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2112595 00:15:27.274 [2024-07-15 13:34:06.680784] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:27.274 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2112595 00:15:27.532 [2024-07-15 13:34:06.700820] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VeaIniQ5HH 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:15:27.532 00:15:27.532 real 0m6.949s 00:15:27.532 user 0m10.995s 00:15:27.532 sys 0m1.252s 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:27.532 13:34:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.532 ************************************ 00:15:27.532 END TEST raid_write_error_test 
00:15:27.532 ************************************ 00:15:27.791 13:34:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:27.791 13:34:06 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:27.791 13:34:06 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:27.792 13:34:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:27.792 13:34:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:27.792 13:34:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:27.792 ************************************ 00:15:27.792 START TEST raid_state_function_test 00:15:27.792 ************************************ 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:27.792 13:34:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2113650 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2113650' 00:15:27.792 Process raid pid: 2113650 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2113650 
/var/tmp/spdk-raid.sock 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2113650 ']' 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:27.792 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:27.792 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.792 [2024-07-15 13:34:07.116664] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:15:27.792 [2024-07-15 13:34:07.116800] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:28.051 [2024-07-15 13:34:07.312542] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:28.051 [2024-07-15 13:34:07.413303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:28.310 [2024-07-15 13:34:07.477948] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.310 [2024-07-15 13:34:07.477977] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.568 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:28.568 13:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:28.568 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:28.568 [2024-07-15 13:34:07.980558] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:28.568 [2024-07-15 13:34:07.980601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:28.568 [2024-07-15 13:34:07.980612] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:28.568 [2024-07-15 13:34:07.980624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:28.568 [2024-07-15 13:34:07.980632] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:28.568 [2024-07-15 13:34:07.980643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:28.827 13:34:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.827 13:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.827 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.827 "name": "Existed_Raid", 00:15:28.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.827 "strip_size_kb": 0, 00:15:28.827 "state": "configuring", 00:15:28.827 "raid_level": "raid1", 00:15:28.827 "superblock": false, 00:15:28.827 "num_base_bdevs": 3, 00:15:28.827 "num_base_bdevs_discovered": 0, 00:15:28.827 "num_base_bdevs_operational": 3, 00:15:28.827 "base_bdevs_list": [ 00:15:28.827 { 00:15:28.827 
"name": "BaseBdev1", 00:15:28.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.827 "is_configured": false, 00:15:28.827 "data_offset": 0, 00:15:28.827 "data_size": 0 00:15:28.827 }, 00:15:28.827 { 00:15:28.827 "name": "BaseBdev2", 00:15:28.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.827 "is_configured": false, 00:15:28.827 "data_offset": 0, 00:15:28.827 "data_size": 0 00:15:28.827 }, 00:15:28.827 { 00:15:28.827 "name": "BaseBdev3", 00:15:28.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.827 "is_configured": false, 00:15:28.827 "data_offset": 0, 00:15:28.827 "data_size": 0 00:15:28.827 } 00:15:28.827 ] 00:15:28.827 }' 00:15:28.827 13:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.827 13:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.761 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:30.020 [2024-07-15 13:34:09.251780] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:30.020 [2024-07-15 13:34:09.251817] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe2a80 name Existed_Raid, state configuring 00:15:30.020 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:30.279 [2024-07-15 13:34:09.500452] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:30.279 [2024-07-15 13:34:09.500482] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:30.279 [2024-07-15 13:34:09.500493] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:15:30.279 [2024-07-15 13:34:09.500505] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:30.279 [2024-07-15 13:34:09.500514] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:30.279 [2024-07-15 13:34:09.500525] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:30.279 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:30.537 [2024-07-15 13:34:09.772898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:30.537 BaseBdev1 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:30.538 13:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.105 13:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:31.673 [ 00:15:31.673 { 00:15:31.673 "name": "BaseBdev1", 00:15:31.673 "aliases": [ 00:15:31.673 "50ab89e8-24e8-49b0-ae89-c61f3577efbb" 
00:15:31.673 ], 00:15:31.673 "product_name": "Malloc disk", 00:15:31.673 "block_size": 512, 00:15:31.673 "num_blocks": 65536, 00:15:31.673 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:31.673 "assigned_rate_limits": { 00:15:31.673 "rw_ios_per_sec": 0, 00:15:31.673 "rw_mbytes_per_sec": 0, 00:15:31.673 "r_mbytes_per_sec": 0, 00:15:31.673 "w_mbytes_per_sec": 0 00:15:31.673 }, 00:15:31.673 "claimed": true, 00:15:31.673 "claim_type": "exclusive_write", 00:15:31.673 "zoned": false, 00:15:31.673 "supported_io_types": { 00:15:31.673 "read": true, 00:15:31.673 "write": true, 00:15:31.673 "unmap": true, 00:15:31.673 "flush": true, 00:15:31.673 "reset": true, 00:15:31.673 "nvme_admin": false, 00:15:31.673 "nvme_io": false, 00:15:31.673 "nvme_io_md": false, 00:15:31.673 "write_zeroes": true, 00:15:31.673 "zcopy": true, 00:15:31.673 "get_zone_info": false, 00:15:31.673 "zone_management": false, 00:15:31.673 "zone_append": false, 00:15:31.673 "compare": false, 00:15:31.673 "compare_and_write": false, 00:15:31.673 "abort": true, 00:15:31.673 "seek_hole": false, 00:15:31.673 "seek_data": false, 00:15:31.673 "copy": true, 00:15:31.673 "nvme_iov_md": false 00:15:31.673 }, 00:15:31.673 "memory_domains": [ 00:15:31.673 { 00:15:31.673 "dma_device_id": "system", 00:15:31.673 "dma_device_type": 1 00:15:31.673 }, 00:15:31.673 { 00:15:31.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.673 "dma_device_type": 2 00:15:31.673 } 00:15:31.673 ], 00:15:31.673 "driver_specific": {} 00:15:31.673 } 00:15:31.673 ] 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.673 13:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.673 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.673 "name": "Existed_Raid", 00:15:31.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.673 "strip_size_kb": 0, 00:15:31.673 "state": "configuring", 00:15:31.673 "raid_level": "raid1", 00:15:31.673 "superblock": false, 00:15:31.673 "num_base_bdevs": 3, 00:15:31.673 "num_base_bdevs_discovered": 1, 00:15:31.673 "num_base_bdevs_operational": 3, 00:15:31.673 "base_bdevs_list": [ 00:15:31.673 { 00:15:31.673 "name": "BaseBdev1", 00:15:31.673 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:31.673 "is_configured": true, 00:15:31.673 "data_offset": 0, 00:15:31.673 "data_size": 65536 00:15:31.673 }, 00:15:31.673 { 00:15:31.673 "name": "BaseBdev2", 00:15:31.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.673 "is_configured": 
false, 00:15:31.673 "data_offset": 0, 00:15:31.673 "data_size": 0 00:15:31.673 }, 00:15:31.673 { 00:15:31.673 "name": "BaseBdev3", 00:15:31.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.673 "is_configured": false, 00:15:31.673 "data_offset": 0, 00:15:31.673 "data_size": 0 00:15:31.673 } 00:15:31.673 ] 00:15:31.673 }' 00:15:31.673 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.673 13:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.240 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:32.498 [2024-07-15 13:34:11.882481] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:32.498 [2024-07-15 13:34:11.882525] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe2310 name Existed_Raid, state configuring 00:15:32.498 13:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:32.757 [2024-07-15 13:34:12.135194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.757 [2024-07-15 13:34:12.136672] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:32.757 [2024-07-15 13:34:12.136711] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:32.757 [2024-07-15 13:34:12.136722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:32.757 [2024-07-15 13:34:12.136733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.757 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.325 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.325 "name": "Existed_Raid", 00:15:33.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.325 "strip_size_kb": 0, 00:15:33.325 "state": "configuring", 00:15:33.325 "raid_level": "raid1", 00:15:33.325 "superblock": false, 00:15:33.325 "num_base_bdevs": 3, 
00:15:33.325 "num_base_bdevs_discovered": 1, 00:15:33.325 "num_base_bdevs_operational": 3, 00:15:33.325 "base_bdevs_list": [ 00:15:33.325 { 00:15:33.325 "name": "BaseBdev1", 00:15:33.325 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:33.325 "is_configured": true, 00:15:33.325 "data_offset": 0, 00:15:33.325 "data_size": 65536 00:15:33.325 }, 00:15:33.325 { 00:15:33.325 "name": "BaseBdev2", 00:15:33.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.325 "is_configured": false, 00:15:33.325 "data_offset": 0, 00:15:33.325 "data_size": 0 00:15:33.325 }, 00:15:33.325 { 00:15:33.325 "name": "BaseBdev3", 00:15:33.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.325 "is_configured": false, 00:15:33.325 "data_offset": 0, 00:15:33.325 "data_size": 0 00:15:33.325 } 00:15:33.326 ] 00:15:33.326 }' 00:15:33.326 13:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.326 13:34:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.893 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:34.151 [2024-07-15 13:34:13.520251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:34.151 BaseBdev2 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:34.151 13:34:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:34.151 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:34.409 13:34:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:34.978 [ 00:15:34.978 { 00:15:34.978 "name": "BaseBdev2", 00:15:34.978 "aliases": [ 00:15:34.978 "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091" 00:15:34.978 ], 00:15:34.978 "product_name": "Malloc disk", 00:15:34.978 "block_size": 512, 00:15:34.978 "num_blocks": 65536, 00:15:34.978 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:34.978 "assigned_rate_limits": { 00:15:34.978 "rw_ios_per_sec": 0, 00:15:34.978 "rw_mbytes_per_sec": 0, 00:15:34.978 "r_mbytes_per_sec": 0, 00:15:34.978 "w_mbytes_per_sec": 0 00:15:34.978 }, 00:15:34.978 "claimed": true, 00:15:34.978 "claim_type": "exclusive_write", 00:15:34.978 "zoned": false, 00:15:34.978 "supported_io_types": { 00:15:34.978 "read": true, 00:15:34.978 "write": true, 00:15:34.979 "unmap": true, 00:15:34.979 "flush": true, 00:15:34.979 "reset": true, 00:15:34.979 "nvme_admin": false, 00:15:34.979 "nvme_io": false, 00:15:34.979 "nvme_io_md": false, 00:15:34.979 "write_zeroes": true, 00:15:34.979 "zcopy": true, 00:15:34.979 "get_zone_info": false, 00:15:34.979 "zone_management": false, 00:15:34.979 "zone_append": false, 00:15:34.979 "compare": false, 00:15:34.979 "compare_and_write": false, 00:15:34.979 "abort": true, 00:15:34.979 "seek_hole": false, 00:15:34.979 "seek_data": false, 00:15:34.979 "copy": true, 00:15:34.979 "nvme_iov_md": false 00:15:34.979 }, 00:15:34.979 "memory_domains": [ 00:15:34.979 { 00:15:34.979 "dma_device_id": "system", 00:15:34.979 "dma_device_type": 1 00:15:34.979 }, 00:15:34.979 { 
00:15:34.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.979 "dma_device_type": 2 00:15:34.979 } 00:15:34.979 ], 00:15:34.979 "driver_specific": {} 00:15:34.979 } 00:15:34.979 ] 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.979 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:35.546 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.546 "name": "Existed_Raid", 00:15:35.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.546 "strip_size_kb": 0, 00:15:35.546 "state": "configuring", 00:15:35.546 "raid_level": "raid1", 00:15:35.546 "superblock": false, 00:15:35.546 "num_base_bdevs": 3, 00:15:35.546 "num_base_bdevs_discovered": 2, 00:15:35.546 "num_base_bdevs_operational": 3, 00:15:35.546 "base_bdevs_list": [ 00:15:35.546 { 00:15:35.546 "name": "BaseBdev1", 00:15:35.546 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:35.546 "is_configured": true, 00:15:35.546 "data_offset": 0, 00:15:35.546 "data_size": 65536 00:15:35.546 }, 00:15:35.546 { 00:15:35.546 "name": "BaseBdev2", 00:15:35.546 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:35.546 "is_configured": true, 00:15:35.546 "data_offset": 0, 00:15:35.546 "data_size": 65536 00:15:35.546 }, 00:15:35.546 { 00:15:35.546 "name": "BaseBdev3", 00:15:35.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.546 "is_configured": false, 00:15:35.546 "data_offset": 0, 00:15:35.546 "data_size": 0 00:15:35.546 } 00:15:35.546 ] 00:15:35.546 }' 00:15:35.546 13:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.546 13:34:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.112 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:36.680 [2024-07-15 13:34:15.903577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:36.680 [2024-07-15 13:34:15.903625] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe3400 00:15:36.680 [2024-07-15 13:34:15.903635] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:15:36.680 [2024-07-15 13:34:15.903902] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe2ef0 00:15:36.680 [2024-07-15 13:34:15.904061] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe3400 00:15:36.680 [2024-07-15 13:34:15.904072] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbe3400 00:15:36.680 [2024-07-15 13:34:15.904263] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.680 BaseBdev3 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:36.680 13:34:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.248 13:34:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:37.541 [ 00:15:37.541 { 00:15:37.541 "name": "BaseBdev3", 00:15:37.541 "aliases": [ 00:15:37.541 "0081d133-4770-4ae9-9f4c-ced17f986871" 00:15:37.541 ], 00:15:37.541 "product_name": "Malloc disk", 00:15:37.541 "block_size": 512, 00:15:37.541 "num_blocks": 65536, 00:15:37.541 "uuid": "0081d133-4770-4ae9-9f4c-ced17f986871", 00:15:37.541 "assigned_rate_limits": { 
00:15:37.541 "rw_ios_per_sec": 0, 00:15:37.541 "rw_mbytes_per_sec": 0, 00:15:37.541 "r_mbytes_per_sec": 0, 00:15:37.541 "w_mbytes_per_sec": 0 00:15:37.541 }, 00:15:37.541 "claimed": true, 00:15:37.541 "claim_type": "exclusive_write", 00:15:37.541 "zoned": false, 00:15:37.541 "supported_io_types": { 00:15:37.541 "read": true, 00:15:37.541 "write": true, 00:15:37.541 "unmap": true, 00:15:37.541 "flush": true, 00:15:37.541 "reset": true, 00:15:37.541 "nvme_admin": false, 00:15:37.541 "nvme_io": false, 00:15:37.541 "nvme_io_md": false, 00:15:37.541 "write_zeroes": true, 00:15:37.541 "zcopy": true, 00:15:37.541 "get_zone_info": false, 00:15:37.541 "zone_management": false, 00:15:37.541 "zone_append": false, 00:15:37.541 "compare": false, 00:15:37.541 "compare_and_write": false, 00:15:37.541 "abort": true, 00:15:37.541 "seek_hole": false, 00:15:37.541 "seek_data": false, 00:15:37.541 "copy": true, 00:15:37.541 "nvme_iov_md": false 00:15:37.541 }, 00:15:37.541 "memory_domains": [ 00:15:37.541 { 00:15:37.541 "dma_device_id": "system", 00:15:37.541 "dma_device_type": 1 00:15:37.541 }, 00:15:37.541 { 00:15:37.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.541 "dma_device_type": 2 00:15:37.541 } 00:15:37.541 ], 00:15:37.541 "driver_specific": {} 00:15:37.541 } 00:15:37.541 ] 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.541 
13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.541 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.800 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.800 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.800 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.800 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.800 13:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.800 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.800 "name": "Existed_Raid", 00:15:37.800 "uuid": "35cc6557-368f-44b9-9ae6-87cfca0dd1b7", 00:15:37.800 "strip_size_kb": 0, 00:15:37.800 "state": "online", 00:15:37.800 "raid_level": "raid1", 00:15:37.800 "superblock": false, 00:15:37.800 "num_base_bdevs": 3, 00:15:37.800 "num_base_bdevs_discovered": 3, 00:15:37.800 "num_base_bdevs_operational": 3, 00:15:37.800 "base_bdevs_list": [ 00:15:37.800 { 00:15:37.800 "name": "BaseBdev1", 00:15:37.800 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:37.800 "is_configured": true, 00:15:37.800 "data_offset": 0, 00:15:37.800 "data_size": 65536 00:15:37.800 }, 00:15:37.800 { 00:15:37.800 "name": "BaseBdev2", 00:15:37.800 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:37.800 "is_configured": true, 00:15:37.800 "data_offset": 0, 
00:15:37.800 "data_size": 65536 00:15:37.800 }, 00:15:37.800 { 00:15:37.800 "name": "BaseBdev3", 00:15:37.800 "uuid": "0081d133-4770-4ae9-9f4c-ced17f986871", 00:15:37.800 "is_configured": true, 00:15:37.800 "data_offset": 0, 00:15:37.800 "data_size": 65536 00:15:37.800 } 00:15:37.800 ] 00:15:37.800 }' 00:15:37.800 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.800 13:34:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:38.735 13:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.735 [2024-07-15 13:34:18.033531] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.735 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.735 "name": "Existed_Raid", 00:15:38.735 "aliases": [ 00:15:38.735 "35cc6557-368f-44b9-9ae6-87cfca0dd1b7" 00:15:38.735 ], 00:15:38.735 "product_name": "Raid Volume", 00:15:38.735 "block_size": 512, 00:15:38.735 "num_blocks": 65536, 00:15:38.735 "uuid": 
"35cc6557-368f-44b9-9ae6-87cfca0dd1b7", 00:15:38.735 "assigned_rate_limits": { 00:15:38.735 "rw_ios_per_sec": 0, 00:15:38.735 "rw_mbytes_per_sec": 0, 00:15:38.735 "r_mbytes_per_sec": 0, 00:15:38.735 "w_mbytes_per_sec": 0 00:15:38.735 }, 00:15:38.735 "claimed": false, 00:15:38.735 "zoned": false, 00:15:38.735 "supported_io_types": { 00:15:38.735 "read": true, 00:15:38.735 "write": true, 00:15:38.735 "unmap": false, 00:15:38.735 "flush": false, 00:15:38.735 "reset": true, 00:15:38.735 "nvme_admin": false, 00:15:38.735 "nvme_io": false, 00:15:38.735 "nvme_io_md": false, 00:15:38.735 "write_zeroes": true, 00:15:38.735 "zcopy": false, 00:15:38.735 "get_zone_info": false, 00:15:38.735 "zone_management": false, 00:15:38.735 "zone_append": false, 00:15:38.735 "compare": false, 00:15:38.735 "compare_and_write": false, 00:15:38.735 "abort": false, 00:15:38.735 "seek_hole": false, 00:15:38.735 "seek_data": false, 00:15:38.735 "copy": false, 00:15:38.735 "nvme_iov_md": false 00:15:38.735 }, 00:15:38.735 "memory_domains": [ 00:15:38.735 { 00:15:38.735 "dma_device_id": "system", 00:15:38.735 "dma_device_type": 1 00:15:38.735 }, 00:15:38.735 { 00:15:38.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.735 "dma_device_type": 2 00:15:38.735 }, 00:15:38.735 { 00:15:38.735 "dma_device_id": "system", 00:15:38.735 "dma_device_type": 1 00:15:38.735 }, 00:15:38.735 { 00:15:38.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.735 "dma_device_type": 2 00:15:38.735 }, 00:15:38.735 { 00:15:38.735 "dma_device_id": "system", 00:15:38.735 "dma_device_type": 1 00:15:38.735 }, 00:15:38.735 { 00:15:38.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.735 "dma_device_type": 2 00:15:38.735 } 00:15:38.735 ], 00:15:38.735 "driver_specific": { 00:15:38.735 "raid": { 00:15:38.735 "uuid": "35cc6557-368f-44b9-9ae6-87cfca0dd1b7", 00:15:38.735 "strip_size_kb": 0, 00:15:38.735 "state": "online", 00:15:38.735 "raid_level": "raid1", 00:15:38.735 "superblock": false, 00:15:38.735 
"num_base_bdevs": 3, 00:15:38.735 "num_base_bdevs_discovered": 3, 00:15:38.735 "num_base_bdevs_operational": 3, 00:15:38.735 "base_bdevs_list": [ 00:15:38.735 { 00:15:38.735 "name": "BaseBdev1", 00:15:38.735 "uuid": "50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:38.735 "is_configured": true, 00:15:38.735 "data_offset": 0, 00:15:38.735 "data_size": 65536 00:15:38.736 }, 00:15:38.736 { 00:15:38.736 "name": "BaseBdev2", 00:15:38.736 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:38.736 "is_configured": true, 00:15:38.736 "data_offset": 0, 00:15:38.736 "data_size": 65536 00:15:38.736 }, 00:15:38.736 { 00:15:38.736 "name": "BaseBdev3", 00:15:38.736 "uuid": "0081d133-4770-4ae9-9f4c-ced17f986871", 00:15:38.736 "is_configured": true, 00:15:38.736 "data_offset": 0, 00:15:38.736 "data_size": 65536 00:15:38.736 } 00:15:38.736 ] 00:15:38.736 } 00:15:38.736 } 00:15:38.736 }' 00:15:38.736 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.736 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:38.736 BaseBdev2 00:15:38.736 BaseBdev3' 00:15:38.736 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.736 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:38.736 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.995 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.995 "name": "BaseBdev1", 00:15:38.995 "aliases": [ 00:15:38.995 "50ab89e8-24e8-49b0-ae89-c61f3577efbb" 00:15:38.995 ], 00:15:38.995 "product_name": "Malloc disk", 00:15:38.995 "block_size": 512, 00:15:38.995 "num_blocks": 65536, 00:15:38.995 "uuid": 
"50ab89e8-24e8-49b0-ae89-c61f3577efbb", 00:15:38.995 "assigned_rate_limits": { 00:15:38.995 "rw_ios_per_sec": 0, 00:15:38.995 "rw_mbytes_per_sec": 0, 00:15:38.995 "r_mbytes_per_sec": 0, 00:15:38.995 "w_mbytes_per_sec": 0 00:15:38.995 }, 00:15:38.995 "claimed": true, 00:15:38.995 "claim_type": "exclusive_write", 00:15:38.995 "zoned": false, 00:15:38.995 "supported_io_types": { 00:15:38.995 "read": true, 00:15:38.995 "write": true, 00:15:38.995 "unmap": true, 00:15:38.995 "flush": true, 00:15:38.995 "reset": true, 00:15:38.995 "nvme_admin": false, 00:15:38.995 "nvme_io": false, 00:15:38.995 "nvme_io_md": false, 00:15:38.995 "write_zeroes": true, 00:15:38.995 "zcopy": true, 00:15:38.995 "get_zone_info": false, 00:15:38.995 "zone_management": false, 00:15:38.995 "zone_append": false, 00:15:38.995 "compare": false, 00:15:38.995 "compare_and_write": false, 00:15:38.995 "abort": true, 00:15:38.995 "seek_hole": false, 00:15:38.995 "seek_data": false, 00:15:38.995 "copy": true, 00:15:38.995 "nvme_iov_md": false 00:15:38.995 }, 00:15:38.995 "memory_domains": [ 00:15:38.995 { 00:15:38.995 "dma_device_id": "system", 00:15:38.995 "dma_device_type": 1 00:15:38.995 }, 00:15:38.995 { 00:15:38.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.995 "dma_device_type": 2 00:15:38.995 } 00:15:38.995 ], 00:15:38.995 "driver_specific": {} 00:15:38.995 }' 00:15:38.995 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.995 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.253 13:34:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.253 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.512 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.512 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.512 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:39.512 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.772 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.772 "name": "BaseBdev2", 00:15:39.772 "aliases": [ 00:15:39.772 "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091" 00:15:39.772 ], 00:15:39.772 "product_name": "Malloc disk", 00:15:39.772 "block_size": 512, 00:15:39.772 "num_blocks": 65536, 00:15:39.772 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:39.772 "assigned_rate_limits": { 00:15:39.772 "rw_ios_per_sec": 0, 00:15:39.772 "rw_mbytes_per_sec": 0, 00:15:39.772 "r_mbytes_per_sec": 0, 00:15:39.772 "w_mbytes_per_sec": 0 00:15:39.772 }, 00:15:39.772 "claimed": true, 00:15:39.772 "claim_type": "exclusive_write", 00:15:39.772 "zoned": false, 00:15:39.772 "supported_io_types": { 00:15:39.772 "read": true, 00:15:39.772 "write": true, 00:15:39.772 "unmap": true, 00:15:39.772 "flush": true, 00:15:39.772 "reset": true, 00:15:39.772 "nvme_admin": false, 00:15:39.772 "nvme_io": false, 00:15:39.772 "nvme_io_md": false, 
00:15:39.772 "write_zeroes": true, 00:15:39.772 "zcopy": true, 00:15:39.772 "get_zone_info": false, 00:15:39.772 "zone_management": false, 00:15:39.772 "zone_append": false, 00:15:39.772 "compare": false, 00:15:39.772 "compare_and_write": false, 00:15:39.772 "abort": true, 00:15:39.772 "seek_hole": false, 00:15:39.772 "seek_data": false, 00:15:39.772 "copy": true, 00:15:39.772 "nvme_iov_md": false 00:15:39.772 }, 00:15:39.772 "memory_domains": [ 00:15:39.772 { 00:15:39.772 "dma_device_id": "system", 00:15:39.772 "dma_device_type": 1 00:15:39.772 }, 00:15:39.772 { 00:15:39.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.772 "dma_device_type": 2 00:15:39.772 } 00:15:39.772 ], 00:15:39.772 "driver_specific": {} 00:15:39.772 }' 00:15:39.772 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.772 13:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.772 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.031 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.031 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.031 13:34:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.031 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:40.031 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.291 "name": "BaseBdev3", 00:15:40.291 "aliases": [ 00:15:40.291 "0081d133-4770-4ae9-9f4c-ced17f986871" 00:15:40.291 ], 00:15:40.291 "product_name": "Malloc disk", 00:15:40.291 "block_size": 512, 00:15:40.291 "num_blocks": 65536, 00:15:40.291 "uuid": "0081d133-4770-4ae9-9f4c-ced17f986871", 00:15:40.291 "assigned_rate_limits": { 00:15:40.291 "rw_ios_per_sec": 0, 00:15:40.291 "rw_mbytes_per_sec": 0, 00:15:40.291 "r_mbytes_per_sec": 0, 00:15:40.291 "w_mbytes_per_sec": 0 00:15:40.291 }, 00:15:40.291 "claimed": true, 00:15:40.291 "claim_type": "exclusive_write", 00:15:40.291 "zoned": false, 00:15:40.291 "supported_io_types": { 00:15:40.291 "read": true, 00:15:40.291 "write": true, 00:15:40.291 "unmap": true, 00:15:40.291 "flush": true, 00:15:40.291 "reset": true, 00:15:40.291 "nvme_admin": false, 00:15:40.291 "nvme_io": false, 00:15:40.291 "nvme_io_md": false, 00:15:40.291 "write_zeroes": true, 00:15:40.291 "zcopy": true, 00:15:40.291 "get_zone_info": false, 00:15:40.291 "zone_management": false, 00:15:40.291 "zone_append": false, 00:15:40.291 "compare": false, 00:15:40.291 "compare_and_write": false, 00:15:40.291 "abort": true, 00:15:40.291 "seek_hole": false, 00:15:40.291 "seek_data": false, 00:15:40.291 "copy": true, 00:15:40.291 "nvme_iov_md": false 00:15:40.291 }, 00:15:40.291 "memory_domains": [ 00:15:40.291 { 00:15:40.291 "dma_device_id": "system", 00:15:40.291 "dma_device_type": 1 00:15:40.291 }, 00:15:40.291 { 00:15:40.291 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:40.291 "dma_device_type": 2 00:15:40.291 } 00:15:40.291 ], 00:15:40.291 "driver_specific": {} 00:15:40.291 }' 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.291 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.553 13:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:40.812 [2024-07-15 13:34:20.110815] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.812 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:41.072 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.072 "name": "Existed_Raid", 00:15:41.072 "uuid": "35cc6557-368f-44b9-9ae6-87cfca0dd1b7", 00:15:41.072 "strip_size_kb": 0, 00:15:41.072 "state": "online", 00:15:41.072 "raid_level": "raid1", 
00:15:41.072 "superblock": false, 00:15:41.072 "num_base_bdevs": 3, 00:15:41.072 "num_base_bdevs_discovered": 2, 00:15:41.072 "num_base_bdevs_operational": 2, 00:15:41.072 "base_bdevs_list": [ 00:15:41.072 { 00:15:41.072 "name": null, 00:15:41.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.072 "is_configured": false, 00:15:41.072 "data_offset": 0, 00:15:41.072 "data_size": 65536 00:15:41.072 }, 00:15:41.072 { 00:15:41.072 "name": "BaseBdev2", 00:15:41.072 "uuid": "f9cd2b27-dea5-4a6b-9e03-b7b24a6e9091", 00:15:41.072 "is_configured": true, 00:15:41.072 "data_offset": 0, 00:15:41.072 "data_size": 65536 00:15:41.072 }, 00:15:41.072 { 00:15:41.072 "name": "BaseBdev3", 00:15:41.072 "uuid": "0081d133-4770-4ae9-9f4c-ced17f986871", 00:15:41.072 "is_configured": true, 00:15:41.072 "data_offset": 0, 00:15:41.072 "data_size": 65536 00:15:41.072 } 00:15:41.072 ] 00:15:41.072 }' 00:15:41.072 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.072 13:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.638 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:41.638 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.638 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.638 13:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:41.895 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:41.895 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:41.895 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:42.154 [2024-07-15 13:34:21.388569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:42.154 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:42.154 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:42.154 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.154 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:42.412 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:42.412 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:42.412 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:42.670 [2024-07-15 13:34:21.899652] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:42.670 [2024-07-15 13:34:21.899741] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.670 [2024-07-15 13:34:21.919157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.670 [2024-07-15 13:34:21.919194] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.670 [2024-07-15 13:34:21.919213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe3400 name Existed_Raid, state offline 00:15:42.670 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:42.670 13:34:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:42.670 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.670 13:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:42.928 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:42.928 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:42.928 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:42.928 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:42.928 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:42.929 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:43.187 BaseBdev2 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.187 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:43.446 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:43.704 [ 00:15:43.704 { 00:15:43.704 "name": "BaseBdev2", 00:15:43.704 "aliases": [ 00:15:43.704 "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6" 00:15:43.704 ], 00:15:43.704 "product_name": "Malloc disk", 00:15:43.704 "block_size": 512, 00:15:43.704 "num_blocks": 65536, 00:15:43.704 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:43.704 "assigned_rate_limits": { 00:15:43.704 "rw_ios_per_sec": 0, 00:15:43.704 "rw_mbytes_per_sec": 0, 00:15:43.704 "r_mbytes_per_sec": 0, 00:15:43.704 "w_mbytes_per_sec": 0 00:15:43.704 }, 00:15:43.704 "claimed": false, 00:15:43.704 "zoned": false, 00:15:43.704 "supported_io_types": { 00:15:43.704 "read": true, 00:15:43.704 "write": true, 00:15:43.704 "unmap": true, 00:15:43.704 "flush": true, 00:15:43.704 "reset": true, 00:15:43.704 "nvme_admin": false, 00:15:43.704 "nvme_io": false, 00:15:43.704 "nvme_io_md": false, 00:15:43.704 "write_zeroes": true, 00:15:43.704 "zcopy": true, 00:15:43.704 "get_zone_info": false, 00:15:43.704 "zone_management": false, 00:15:43.704 "zone_append": false, 00:15:43.704 "compare": false, 00:15:43.704 "compare_and_write": false, 00:15:43.704 "abort": true, 00:15:43.704 "seek_hole": false, 00:15:43.704 "seek_data": false, 00:15:43.704 "copy": true, 00:15:43.704 "nvme_iov_md": false 00:15:43.704 }, 00:15:43.704 "memory_domains": [ 00:15:43.704 { 00:15:43.704 "dma_device_id": "system", 00:15:43.704 "dma_device_type": 1 00:15:43.704 }, 00:15:43.704 { 00:15:43.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.704 "dma_device_type": 2 00:15:43.704 } 00:15:43.704 ], 00:15:43.704 "driver_specific": {} 00:15:43.704 } 00:15:43.704 ] 00:15:43.704 13:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:43.704 
13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:43.704 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:43.704 13:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:43.962 BaseBdev3 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.962 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:44.221 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:44.221 [ 00:15:44.221 { 00:15:44.221 "name": "BaseBdev3", 00:15:44.221 "aliases": [ 00:15:44.221 "3ab9cef3-dcdf-4783-aa65-26a3898371b2" 00:15:44.221 ], 00:15:44.221 "product_name": "Malloc disk", 00:15:44.221 "block_size": 512, 00:15:44.221 "num_blocks": 65536, 00:15:44.221 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:44.221 "assigned_rate_limits": { 00:15:44.221 "rw_ios_per_sec": 0, 00:15:44.221 "rw_mbytes_per_sec": 0, 00:15:44.221 
"r_mbytes_per_sec": 0, 00:15:44.221 "w_mbytes_per_sec": 0 00:15:44.221 }, 00:15:44.221 "claimed": false, 00:15:44.221 "zoned": false, 00:15:44.221 "supported_io_types": { 00:15:44.221 "read": true, 00:15:44.221 "write": true, 00:15:44.221 "unmap": true, 00:15:44.221 "flush": true, 00:15:44.221 "reset": true, 00:15:44.221 "nvme_admin": false, 00:15:44.221 "nvme_io": false, 00:15:44.221 "nvme_io_md": false, 00:15:44.221 "write_zeroes": true, 00:15:44.221 "zcopy": true, 00:15:44.221 "get_zone_info": false, 00:15:44.221 "zone_management": false, 00:15:44.221 "zone_append": false, 00:15:44.221 "compare": false, 00:15:44.221 "compare_and_write": false, 00:15:44.221 "abort": true, 00:15:44.221 "seek_hole": false, 00:15:44.221 "seek_data": false, 00:15:44.221 "copy": true, 00:15:44.221 "nvme_iov_md": false 00:15:44.221 }, 00:15:44.221 "memory_domains": [ 00:15:44.221 { 00:15:44.221 "dma_device_id": "system", 00:15:44.221 "dma_device_type": 1 00:15:44.221 }, 00:15:44.221 { 00:15:44.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.221 "dma_device_type": 2 00:15:44.221 } 00:15:44.221 ], 00:15:44.221 "driver_specific": {} 00:15:44.221 } 00:15:44.221 ] 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:44.479 [2024-07-15 13:34:23.878460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.479 [2024-07-15 13:34:23.878505] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:15:44.479 [2024-07-15 13:34:23.878523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:44.479 [2024-07-15 13:34:23.879867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.479 13:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.736 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.736 "name": "Existed_Raid", 00:15:44.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.736 "strip_size_kb": 0, 00:15:44.736 "state": 
"configuring", 00:15:44.736 "raid_level": "raid1", 00:15:44.736 "superblock": false, 00:15:44.736 "num_base_bdevs": 3, 00:15:44.736 "num_base_bdevs_discovered": 2, 00:15:44.736 "num_base_bdevs_operational": 3, 00:15:44.736 "base_bdevs_list": [ 00:15:44.736 { 00:15:44.736 "name": "BaseBdev1", 00:15:44.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.736 "is_configured": false, 00:15:44.736 "data_offset": 0, 00:15:44.736 "data_size": 0 00:15:44.736 }, 00:15:44.736 { 00:15:44.736 "name": "BaseBdev2", 00:15:44.736 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:44.736 "is_configured": true, 00:15:44.736 "data_offset": 0, 00:15:44.736 "data_size": 65536 00:15:44.736 }, 00:15:44.736 { 00:15:44.736 "name": "BaseBdev3", 00:15:44.737 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:44.737 "is_configured": true, 00:15:44.737 "data_offset": 0, 00:15:44.737 "data_size": 65536 00:15:44.737 } 00:15:44.737 ] 00:15:44.737 }' 00:15:44.737 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.737 13:34:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.301 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:45.559 [2024-07-15 13:34:24.901154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:45.559 13:34:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.559 13:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.818 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.818 "name": "Existed_Raid", 00:15:45.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.818 "strip_size_kb": 0, 00:15:45.818 "state": "configuring", 00:15:45.818 "raid_level": "raid1", 00:15:45.818 "superblock": false, 00:15:45.818 "num_base_bdevs": 3, 00:15:45.818 "num_base_bdevs_discovered": 1, 00:15:45.818 "num_base_bdevs_operational": 3, 00:15:45.818 "base_bdevs_list": [ 00:15:45.818 { 00:15:45.818 "name": "BaseBdev1", 00:15:45.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.818 "is_configured": false, 00:15:45.818 "data_offset": 0, 00:15:45.818 "data_size": 0 00:15:45.818 }, 00:15:45.818 { 00:15:45.818 "name": null, 00:15:45.818 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:45.818 "is_configured": false, 00:15:45.818 "data_offset": 0, 00:15:45.818 "data_size": 65536 00:15:45.818 }, 00:15:45.818 { 00:15:45.818 "name": "BaseBdev3", 00:15:45.818 "uuid": 
"3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:45.818 "is_configured": true, 00:15:45.818 "data_offset": 0, 00:15:45.818 "data_size": 65536 00:15:45.818 } 00:15:45.818 ] 00:15:45.818 }' 00:15:45.818 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.818 13:34:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.383 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.383 13:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:46.642 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:46.642 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:46.900 [2024-07-15 13:34:26.289089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.900 BaseBdev1 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:46.900 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.158 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:47.415 [ 00:15:47.415 { 00:15:47.415 "name": "BaseBdev1", 00:15:47.415 "aliases": [ 00:15:47.415 "ce591d75-e88b-4563-95c0-c0f214ecf8bb" 00:15:47.415 ], 00:15:47.415 "product_name": "Malloc disk", 00:15:47.415 "block_size": 512, 00:15:47.415 "num_blocks": 65536, 00:15:47.415 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:47.415 "assigned_rate_limits": { 00:15:47.415 "rw_ios_per_sec": 0, 00:15:47.415 "rw_mbytes_per_sec": 0, 00:15:47.415 "r_mbytes_per_sec": 0, 00:15:47.415 "w_mbytes_per_sec": 0 00:15:47.415 }, 00:15:47.415 "claimed": true, 00:15:47.415 "claim_type": "exclusive_write", 00:15:47.415 "zoned": false, 00:15:47.415 "supported_io_types": { 00:15:47.415 "read": true, 00:15:47.415 "write": true, 00:15:47.415 "unmap": true, 00:15:47.415 "flush": true, 00:15:47.415 "reset": true, 00:15:47.415 "nvme_admin": false, 00:15:47.415 "nvme_io": false, 00:15:47.415 "nvme_io_md": false, 00:15:47.415 "write_zeroes": true, 00:15:47.415 "zcopy": true, 00:15:47.415 "get_zone_info": false, 00:15:47.415 "zone_management": false, 00:15:47.415 "zone_append": false, 00:15:47.415 "compare": false, 00:15:47.415 "compare_and_write": false, 00:15:47.415 "abort": true, 00:15:47.415 "seek_hole": false, 00:15:47.415 "seek_data": false, 00:15:47.415 "copy": true, 00:15:47.415 "nvme_iov_md": false 00:15:47.415 }, 00:15:47.415 "memory_domains": [ 00:15:47.415 { 00:15:47.415 "dma_device_id": "system", 00:15:47.415 "dma_device_type": 1 00:15:47.415 }, 00:15:47.415 { 00:15:47.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.415 "dma_device_type": 2 00:15:47.415 } 00:15:47.415 ], 00:15:47.415 "driver_specific": {} 00:15:47.415 } 00:15:47.415 ] 
00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.415 13:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.673 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.673 "name": "Existed_Raid", 00:15:47.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.673 "strip_size_kb": 0, 00:15:47.673 "state": "configuring", 00:15:47.673 "raid_level": "raid1", 00:15:47.673 "superblock": false, 00:15:47.673 "num_base_bdevs": 3, 00:15:47.673 
"num_base_bdevs_discovered": 2, 00:15:47.673 "num_base_bdevs_operational": 3, 00:15:47.673 "base_bdevs_list": [ 00:15:47.673 { 00:15:47.673 "name": "BaseBdev1", 00:15:47.673 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:47.673 "is_configured": true, 00:15:47.673 "data_offset": 0, 00:15:47.673 "data_size": 65536 00:15:47.673 }, 00:15:47.673 { 00:15:47.673 "name": null, 00:15:47.673 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:47.673 "is_configured": false, 00:15:47.673 "data_offset": 0, 00:15:47.673 "data_size": 65536 00:15:47.673 }, 00:15:47.673 { 00:15:47.673 "name": "BaseBdev3", 00:15:47.673 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:47.673 "is_configured": true, 00:15:47.673 "data_offset": 0, 00:15:47.673 "data_size": 65536 00:15:47.673 } 00:15:47.673 ] 00:15:47.673 }' 00:15:47.673 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.673 13:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.607 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:48.607 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.607 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:48.607 13:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:48.865 [2024-07-15 13:34:28.174160] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.865 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.124 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.124 "name": "Existed_Raid", 00:15:49.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.124 "strip_size_kb": 0, 00:15:49.124 "state": "configuring", 00:15:49.124 "raid_level": "raid1", 00:15:49.124 "superblock": false, 00:15:49.124 "num_base_bdevs": 3, 00:15:49.124 "num_base_bdevs_discovered": 1, 00:15:49.124 "num_base_bdevs_operational": 3, 00:15:49.124 "base_bdevs_list": [ 00:15:49.124 { 00:15:49.124 "name": "BaseBdev1", 00:15:49.124 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:49.124 "is_configured": true, 00:15:49.124 "data_offset": 0, 00:15:49.124 "data_size": 65536 
00:15:49.124 }, 00:15:49.124 { 00:15:49.124 "name": null, 00:15:49.124 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:49.124 "is_configured": false, 00:15:49.124 "data_offset": 0, 00:15:49.124 "data_size": 65536 00:15:49.124 }, 00:15:49.124 { 00:15:49.124 "name": null, 00:15:49.124 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:49.124 "is_configured": false, 00:15:49.124 "data_offset": 0, 00:15:49.124 "data_size": 65536 00:15:49.124 } 00:15:49.124 ] 00:15:49.124 }' 00:15:49.124 13:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.124 13:34:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.691 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.691 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:49.949 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:49.949 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:50.208 [2024-07-15 13:34:29.497708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:50.208 13:34:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.208 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.467 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.467 "name": "Existed_Raid", 00:15:50.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.467 "strip_size_kb": 0, 00:15:50.467 "state": "configuring", 00:15:50.467 "raid_level": "raid1", 00:15:50.467 "superblock": false, 00:15:50.467 "num_base_bdevs": 3, 00:15:50.467 "num_base_bdevs_discovered": 2, 00:15:50.467 "num_base_bdevs_operational": 3, 00:15:50.467 "base_bdevs_list": [ 00:15:50.467 { 00:15:50.467 "name": "BaseBdev1", 00:15:50.467 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:50.467 "is_configured": true, 00:15:50.467 "data_offset": 0, 00:15:50.467 "data_size": 65536 00:15:50.467 }, 00:15:50.467 { 00:15:50.467 "name": null, 00:15:50.467 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:50.467 "is_configured": false, 00:15:50.467 "data_offset": 0, 00:15:50.467 "data_size": 65536 00:15:50.467 }, 00:15:50.467 { 00:15:50.467 "name": "BaseBdev3", 00:15:50.467 "uuid": 
"3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:50.467 "is_configured": true, 00:15:50.467 "data_offset": 0, 00:15:50.467 "data_size": 65536 00:15:50.467 } 00:15:50.467 ] 00:15:50.467 }' 00:15:50.467 13:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.467 13:34:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.033 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.033 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:51.290 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:51.290 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:51.591 [2024-07-15 13:34:30.769102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.591 13:34:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.591 13:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.874 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.874 "name": "Existed_Raid", 00:15:51.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.874 "strip_size_kb": 0, 00:15:51.874 "state": "configuring", 00:15:51.874 "raid_level": "raid1", 00:15:51.874 "superblock": false, 00:15:51.874 "num_base_bdevs": 3, 00:15:51.874 "num_base_bdevs_discovered": 1, 00:15:51.874 "num_base_bdevs_operational": 3, 00:15:51.874 "base_bdevs_list": [ 00:15:51.874 { 00:15:51.874 "name": null, 00:15:51.874 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:51.874 "is_configured": false, 00:15:51.874 "data_offset": 0, 00:15:51.874 "data_size": 65536 00:15:51.874 }, 00:15:51.874 { 00:15:51.874 "name": null, 00:15:51.874 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:51.874 "is_configured": false, 00:15:51.874 "data_offset": 0, 00:15:51.874 "data_size": 65536 00:15:51.874 }, 00:15:51.874 { 00:15:51.874 "name": "BaseBdev3", 00:15:51.874 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:51.874 "is_configured": true, 00:15:51.874 "data_offset": 0, 00:15:51.874 "data_size": 65536 00:15:51.874 } 00:15:51.874 ] 00:15:51.874 }' 00:15:51.874 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.874 13:34:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:15:52.440 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.440 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:52.698 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:52.698 13:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:52.698 [2024-07-15 13:34:32.048560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.698 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.956 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.956 "name": "Existed_Raid", 00:15:52.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.956 "strip_size_kb": 0, 00:15:52.956 "state": "configuring", 00:15:52.956 "raid_level": "raid1", 00:15:52.956 "superblock": false, 00:15:52.956 "num_base_bdevs": 3, 00:15:52.956 "num_base_bdevs_discovered": 2, 00:15:52.956 "num_base_bdevs_operational": 3, 00:15:52.956 "base_bdevs_list": [ 00:15:52.956 { 00:15:52.956 "name": null, 00:15:52.956 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:52.956 "is_configured": false, 00:15:52.956 "data_offset": 0, 00:15:52.956 "data_size": 65536 00:15:52.956 }, 00:15:52.956 { 00:15:52.956 "name": "BaseBdev2", 00:15:52.956 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:52.956 "is_configured": true, 00:15:52.956 "data_offset": 0, 00:15:52.956 "data_size": 65536 00:15:52.956 }, 00:15:52.956 { 00:15:52.956 "name": "BaseBdev3", 00:15:52.956 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:52.956 "is_configured": true, 00:15:52.956 "data_offset": 0, 00:15:52.956 "data_size": 65536 00:15:52.956 } 00:15:52.956 ] 00:15:52.956 }' 00:15:52.956 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.956 13:34:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.523 13:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.523 13:34:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:53.782 13:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:53.782 13:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.782 13:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:54.039 13:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ce591d75-e88b-4563-95c0-c0f214ecf8bb 00:15:54.296 [2024-07-15 13:34:33.561682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:54.296 [2024-07-15 13:34:33.561725] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe6e40 00:15:54.296 [2024-07-15 13:34:33.561734] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:54.296 [2024-07-15 13:34:33.561946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe3e60 00:15:54.296 [2024-07-15 13:34:33.562087] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe6e40 00:15:54.296 [2024-07-15 13:34:33.562097] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbe6e40 00:15:54.296 [2024-07-15 13:34:33.562284] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.296 NewBaseBdev 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # 
local bdev_timeout= 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.296 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:54.554 13:34:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:54.813 [ 00:15:54.813 { 00:15:54.813 "name": "NewBaseBdev", 00:15:54.813 "aliases": [ 00:15:54.813 "ce591d75-e88b-4563-95c0-c0f214ecf8bb" 00:15:54.813 ], 00:15:54.813 "product_name": "Malloc disk", 00:15:54.813 "block_size": 512, 00:15:54.813 "num_blocks": 65536, 00:15:54.813 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:54.813 "assigned_rate_limits": { 00:15:54.813 "rw_ios_per_sec": 0, 00:15:54.813 "rw_mbytes_per_sec": 0, 00:15:54.813 "r_mbytes_per_sec": 0, 00:15:54.813 "w_mbytes_per_sec": 0 00:15:54.813 }, 00:15:54.813 "claimed": true, 00:15:54.813 "claim_type": "exclusive_write", 00:15:54.813 "zoned": false, 00:15:54.813 "supported_io_types": { 00:15:54.813 "read": true, 00:15:54.813 "write": true, 00:15:54.813 "unmap": true, 00:15:54.813 "flush": true, 00:15:54.813 "reset": true, 00:15:54.813 "nvme_admin": false, 00:15:54.813 "nvme_io": false, 00:15:54.813 "nvme_io_md": false, 00:15:54.813 "write_zeroes": true, 00:15:54.813 "zcopy": true, 00:15:54.813 "get_zone_info": false, 00:15:54.813 "zone_management": false, 00:15:54.813 "zone_append": false, 00:15:54.813 "compare": false, 00:15:54.813 "compare_and_write": false, 00:15:54.813 "abort": true, 00:15:54.813 "seek_hole": false, 
00:15:54.813 "seek_data": false, 00:15:54.813 "copy": true, 00:15:54.813 "nvme_iov_md": false 00:15:54.813 }, 00:15:54.813 "memory_domains": [ 00:15:54.813 { 00:15:54.813 "dma_device_id": "system", 00:15:54.813 "dma_device_type": 1 00:15:54.813 }, 00:15:54.813 { 00:15:54.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.813 "dma_device_type": 2 00:15:54.813 } 00:15:54.813 ], 00:15:54.813 "driver_specific": {} 00:15:54.813 } 00:15:54.813 ] 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.813 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:15:55.072 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.072 "name": "Existed_Raid", 00:15:55.072 "uuid": "b48bd280-7a57-4347-ba51-f26cb504f890", 00:15:55.072 "strip_size_kb": 0, 00:15:55.072 "state": "online", 00:15:55.072 "raid_level": "raid1", 00:15:55.072 "superblock": false, 00:15:55.072 "num_base_bdevs": 3, 00:15:55.072 "num_base_bdevs_discovered": 3, 00:15:55.072 "num_base_bdevs_operational": 3, 00:15:55.072 "base_bdevs_list": [ 00:15:55.072 { 00:15:55.072 "name": "NewBaseBdev", 00:15:55.072 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:55.072 "is_configured": true, 00:15:55.072 "data_offset": 0, 00:15:55.072 "data_size": 65536 00:15:55.072 }, 00:15:55.072 { 00:15:55.072 "name": "BaseBdev2", 00:15:55.072 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:55.072 "is_configured": true, 00:15:55.072 "data_offset": 0, 00:15:55.072 "data_size": 65536 00:15:55.072 }, 00:15:55.072 { 00:15:55.072 "name": "BaseBdev3", 00:15:55.072 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:55.072 "is_configured": true, 00:15:55.072 "data_offset": 0, 00:15:55.073 "data_size": 65536 00:15:55.073 } 00:15:55.073 ] 00:15:55.073 }' 00:15:55.073 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.073 13:34:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:55.641 13:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:55.899 [2024-07-15 13:34:35.142188] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:55.899 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:55.899 "name": "Existed_Raid", 00:15:55.899 "aliases": [ 00:15:55.899 "b48bd280-7a57-4347-ba51-f26cb504f890" 00:15:55.899 ], 00:15:55.899 "product_name": "Raid Volume", 00:15:55.899 "block_size": 512, 00:15:55.899 "num_blocks": 65536, 00:15:55.899 "uuid": "b48bd280-7a57-4347-ba51-f26cb504f890", 00:15:55.899 "assigned_rate_limits": { 00:15:55.899 "rw_ios_per_sec": 0, 00:15:55.899 "rw_mbytes_per_sec": 0, 00:15:55.899 "r_mbytes_per_sec": 0, 00:15:55.899 "w_mbytes_per_sec": 0 00:15:55.899 }, 00:15:55.899 "claimed": false, 00:15:55.899 "zoned": false, 00:15:55.899 "supported_io_types": { 00:15:55.899 "read": true, 00:15:55.899 "write": true, 00:15:55.899 "unmap": false, 00:15:55.899 "flush": false, 00:15:55.899 "reset": true, 00:15:55.899 "nvme_admin": false, 00:15:55.899 "nvme_io": false, 00:15:55.899 "nvme_io_md": false, 00:15:55.899 "write_zeroes": true, 00:15:55.899 "zcopy": false, 00:15:55.899 "get_zone_info": false, 00:15:55.899 "zone_management": false, 00:15:55.899 "zone_append": false, 00:15:55.899 "compare": false, 00:15:55.899 "compare_and_write": false, 00:15:55.899 "abort": false, 00:15:55.899 "seek_hole": false, 00:15:55.899 "seek_data": false, 00:15:55.899 "copy": false, 00:15:55.899 "nvme_iov_md": false 00:15:55.899 }, 00:15:55.899 "memory_domains": [ 00:15:55.899 { 00:15:55.899 "dma_device_id": "system", 
00:15:55.899 "dma_device_type": 1 00:15:55.899 }, 00:15:55.899 { 00:15:55.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.899 "dma_device_type": 2 00:15:55.899 }, 00:15:55.899 { 00:15:55.899 "dma_device_id": "system", 00:15:55.899 "dma_device_type": 1 00:15:55.899 }, 00:15:55.899 { 00:15:55.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.899 "dma_device_type": 2 00:15:55.899 }, 00:15:55.899 { 00:15:55.899 "dma_device_id": "system", 00:15:55.899 "dma_device_type": 1 00:15:55.899 }, 00:15:55.899 { 00:15:55.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.899 "dma_device_type": 2 00:15:55.899 } 00:15:55.899 ], 00:15:55.899 "driver_specific": { 00:15:55.899 "raid": { 00:15:55.899 "uuid": "b48bd280-7a57-4347-ba51-f26cb504f890", 00:15:55.899 "strip_size_kb": 0, 00:15:55.899 "state": "online", 00:15:55.899 "raid_level": "raid1", 00:15:55.899 "superblock": false, 00:15:55.899 "num_base_bdevs": 3, 00:15:55.899 "num_base_bdevs_discovered": 3, 00:15:55.899 "num_base_bdevs_operational": 3, 00:15:55.899 "base_bdevs_list": [ 00:15:55.900 { 00:15:55.900 "name": "NewBaseBdev", 00:15:55.900 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:55.900 "is_configured": true, 00:15:55.900 "data_offset": 0, 00:15:55.900 "data_size": 65536 00:15:55.900 }, 00:15:55.900 { 00:15:55.900 "name": "BaseBdev2", 00:15:55.900 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:55.900 "is_configured": true, 00:15:55.900 "data_offset": 0, 00:15:55.900 "data_size": 65536 00:15:55.900 }, 00:15:55.900 { 00:15:55.900 "name": "BaseBdev3", 00:15:55.900 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:55.900 "is_configured": true, 00:15:55.900 "data_offset": 0, 00:15:55.900 "data_size": 65536 00:15:55.900 } 00:15:55.900 ] 00:15:55.900 } 00:15:55.900 } 00:15:55.900 }' 00:15:55.900 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:55.900 13:34:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:55.900 BaseBdev2 00:15:55.900 BaseBdev3' 00:15:55.900 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:55.900 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:55.900 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:56.157 "name": "NewBaseBdev", 00:15:56.157 "aliases": [ 00:15:56.157 "ce591d75-e88b-4563-95c0-c0f214ecf8bb" 00:15:56.157 ], 00:15:56.157 "product_name": "Malloc disk", 00:15:56.157 "block_size": 512, 00:15:56.157 "num_blocks": 65536, 00:15:56.157 "uuid": "ce591d75-e88b-4563-95c0-c0f214ecf8bb", 00:15:56.157 "assigned_rate_limits": { 00:15:56.157 "rw_ios_per_sec": 0, 00:15:56.157 "rw_mbytes_per_sec": 0, 00:15:56.157 "r_mbytes_per_sec": 0, 00:15:56.157 "w_mbytes_per_sec": 0 00:15:56.157 }, 00:15:56.157 "claimed": true, 00:15:56.157 "claim_type": "exclusive_write", 00:15:56.157 "zoned": false, 00:15:56.157 "supported_io_types": { 00:15:56.157 "read": true, 00:15:56.157 "write": true, 00:15:56.157 "unmap": true, 00:15:56.157 "flush": true, 00:15:56.157 "reset": true, 00:15:56.157 "nvme_admin": false, 00:15:56.157 "nvme_io": false, 00:15:56.157 "nvme_io_md": false, 00:15:56.157 "write_zeroes": true, 00:15:56.157 "zcopy": true, 00:15:56.157 "get_zone_info": false, 00:15:56.157 "zone_management": false, 00:15:56.157 "zone_append": false, 00:15:56.157 "compare": false, 00:15:56.157 "compare_and_write": false, 00:15:56.157 "abort": true, 00:15:56.157 "seek_hole": false, 00:15:56.157 "seek_data": false, 00:15:56.157 "copy": true, 00:15:56.157 "nvme_iov_md": false 00:15:56.157 }, 00:15:56.157 "memory_domains": [ 
00:15:56.157 { 00:15:56.157 "dma_device_id": "system", 00:15:56.157 "dma_device_type": 1 00:15:56.157 }, 00:15:56.157 { 00:15:56.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.157 "dma_device_type": 2 00:15:56.157 } 00:15:56.157 ], 00:15:56.157 "driver_specific": {} 00:15:56.157 }' 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.157 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:56.416 13:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:56.674 13:34:36 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:56.674 "name": "BaseBdev2", 00:15:56.674 "aliases": [ 00:15:56.674 "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6" 00:15:56.674 ], 00:15:56.674 "product_name": "Malloc disk", 00:15:56.674 "block_size": 512, 00:15:56.674 "num_blocks": 65536, 00:15:56.674 "uuid": "8b3cb821-6c2d-49da-80b6-fa5f97c0c5b6", 00:15:56.674 "assigned_rate_limits": { 00:15:56.674 "rw_ios_per_sec": 0, 00:15:56.674 "rw_mbytes_per_sec": 0, 00:15:56.674 "r_mbytes_per_sec": 0, 00:15:56.674 "w_mbytes_per_sec": 0 00:15:56.674 }, 00:15:56.674 "claimed": true, 00:15:56.674 "claim_type": "exclusive_write", 00:15:56.674 "zoned": false, 00:15:56.674 "supported_io_types": { 00:15:56.674 "read": true, 00:15:56.674 "write": true, 00:15:56.674 "unmap": true, 00:15:56.674 "flush": true, 00:15:56.674 "reset": true, 00:15:56.674 "nvme_admin": false, 00:15:56.674 "nvme_io": false, 00:15:56.674 "nvme_io_md": false, 00:15:56.674 "write_zeroes": true, 00:15:56.674 "zcopy": true, 00:15:56.674 "get_zone_info": false, 00:15:56.674 "zone_management": false, 00:15:56.674 "zone_append": false, 00:15:56.674 "compare": false, 00:15:56.674 "compare_and_write": false, 00:15:56.674 "abort": true, 00:15:56.674 "seek_hole": false, 00:15:56.674 "seek_data": false, 00:15:56.674 "copy": true, 00:15:56.674 "nvme_iov_md": false 00:15:56.674 }, 00:15:56.674 "memory_domains": [ 00:15:56.674 { 00:15:56.674 "dma_device_id": "system", 00:15:56.674 "dma_device_type": 1 00:15:56.674 }, 00:15:56.674 { 00:15:56.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.674 "dma_device_type": 2 00:15:56.675 } 00:15:56.675 ], 00:15:56.675 "driver_specific": {} 00:15:56.675 }' 00:15:56.675 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:56.933 13:34:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:56.933 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.192 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.192 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:57.192 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:57.192 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:57.192 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:57.451 "name": "BaseBdev3", 00:15:57.451 "aliases": [ 00:15:57.451 "3ab9cef3-dcdf-4783-aa65-26a3898371b2" 00:15:57.451 ], 00:15:57.451 "product_name": "Malloc disk", 00:15:57.451 "block_size": 512, 00:15:57.451 "num_blocks": 65536, 00:15:57.451 "uuid": "3ab9cef3-dcdf-4783-aa65-26a3898371b2", 00:15:57.451 "assigned_rate_limits": { 00:15:57.451 "rw_ios_per_sec": 0, 00:15:57.451 "rw_mbytes_per_sec": 0, 00:15:57.451 "r_mbytes_per_sec": 0, 00:15:57.451 "w_mbytes_per_sec": 0 00:15:57.451 }, 00:15:57.451 "claimed": true, 00:15:57.451 "claim_type": "exclusive_write", 
00:15:57.451 "zoned": false, 00:15:57.451 "supported_io_types": { 00:15:57.451 "read": true, 00:15:57.451 "write": true, 00:15:57.451 "unmap": true, 00:15:57.451 "flush": true, 00:15:57.451 "reset": true, 00:15:57.451 "nvme_admin": false, 00:15:57.451 "nvme_io": false, 00:15:57.451 "nvme_io_md": false, 00:15:57.451 "write_zeroes": true, 00:15:57.451 "zcopy": true, 00:15:57.451 "get_zone_info": false, 00:15:57.451 "zone_management": false, 00:15:57.451 "zone_append": false, 00:15:57.451 "compare": false, 00:15:57.451 "compare_and_write": false, 00:15:57.451 "abort": true, 00:15:57.451 "seek_hole": false, 00:15:57.451 "seek_data": false, 00:15:57.451 "copy": true, 00:15:57.451 "nvme_iov_md": false 00:15:57.451 }, 00:15:57.451 "memory_domains": [ 00:15:57.451 { 00:15:57.451 "dma_device_id": "system", 00:15:57.451 "dma_device_type": 1 00:15:57.451 }, 00:15:57.451 { 00:15:57.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.451 "dma_device_type": 2 00:15:57.451 } 00:15:57.451 ], 00:15:57.451 "driver_specific": {} 00:15:57.451 }' 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:57.451 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:57.711 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:57.711 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:57.711 13:34:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.711 13:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.711 13:34:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:57.711 13:34:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:57.970 [2024-07-15 13:34:37.231533] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:57.970 [2024-07-15 13:34:37.231561] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:57.970 [2024-07-15 13:34:37.231620] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:57.970 [2024-07-15 13:34:37.231909] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:57.970 [2024-07-15 13:34:37.231921] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe6e40 name Existed_Raid, state offline 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2113650 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2113650 ']' 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2113650 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2113650 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:57.970 13:34:37 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2113650' 00:15:57.970 killing process with pid 2113650 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2113650 00:15:57.970 [2024-07-15 13:34:37.299237] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:57.970 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2113650 00:15:57.970 [2024-07-15 13:34:37.347902] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:58.537 00:15:58.537 real 0m30.701s 00:15:58.537 user 0m56.186s 00:15:58.537 sys 0m5.371s 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.537 ************************************ 00:15:58.537 END TEST raid_state_function_test 00:15:58.537 ************************************ 00:15:58.537 13:34:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:58.537 13:34:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:15:58.537 13:34:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:58.537 13:34:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:58.537 13:34:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:58.537 ************************************ 00:15:58.537 START TEST raid_state_function_test_sb 00:15:58.537 ************************************ 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2118281 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2118281' 00:15:58.537 Process raid pid: 2118281 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2118281 /var/tmp/spdk-raid.sock 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2118281 ']' 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:58.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:58.537 13:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.537 [2024-07-15 13:34:37.868951] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:15:58.537 [2024-07-15 13:34:37.869019] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:58.795 [2024-07-15 13:34:38.001089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:58.795 [2024-07-15 13:34:38.104633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:58.795 [2024-07-15 13:34:38.172047] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.795 [2024-07-15 13:34:38.172082] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:59.729 13:34:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:59.729 13:34:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:59.729 13:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:59.729 [2024-07-15 13:34:39.026850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:59.729 [2024-07-15 13:34:39.026894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:59.729 [2024-07-15 13:34:39.026906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:15:59.729 [2024-07-15 13:34:39.026923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:59.729 [2024-07-15 13:34:39.026937] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:59.729 [2024-07-15 13:34:39.026948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.729 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.988 13:34:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.988 "name": "Existed_Raid", 00:15:59.988 "uuid": "94ce8633-7f07-4c07-a74c-e1e31f08ac9e", 00:15:59.988 "strip_size_kb": 0, 00:15:59.988 "state": "configuring", 00:15:59.988 "raid_level": "raid1", 00:15:59.988 "superblock": true, 00:15:59.988 "num_base_bdevs": 3, 00:15:59.988 "num_base_bdevs_discovered": 0, 00:15:59.988 "num_base_bdevs_operational": 3, 00:15:59.988 "base_bdevs_list": [ 00:15:59.988 { 00:15:59.988 "name": "BaseBdev1", 00:15:59.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.988 "is_configured": false, 00:15:59.988 "data_offset": 0, 00:15:59.988 "data_size": 0 00:15:59.988 }, 00:15:59.988 { 00:15:59.988 "name": "BaseBdev2", 00:15:59.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.988 "is_configured": false, 00:15:59.988 "data_offset": 0, 00:15:59.988 "data_size": 0 00:15:59.988 }, 00:15:59.988 { 00:15:59.988 "name": "BaseBdev3", 00:15:59.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.988 "is_configured": false, 00:15:59.988 "data_offset": 0, 00:15:59.988 "data_size": 0 00:15:59.988 } 00:15:59.988 ] 00:15:59.988 }' 00:15:59.988 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.988 13:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.554 13:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:00.813 [2024-07-15 13:34:40.113588] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:00.813 [2024-07-15 13:34:40.113623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x174ca80 name Existed_Raid, state configuring 00:16:00.813 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:01.071 [2024-07-15 13:34:40.362270] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:01.071 [2024-07-15 13:34:40.362305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:01.071 [2024-07-15 13:34:40.362314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:01.071 [2024-07-15 13:34:40.362326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:01.071 [2024-07-15 13:34:40.362334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:01.071 [2024-07-15 13:34:40.362354] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:01.071 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:01.330 [2024-07-15 13:34:40.616933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.330 BaseBdev1 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:16:01.330 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.588 13:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:01.846 [ 00:16:01.846 { 00:16:01.846 "name": "BaseBdev1", 00:16:01.846 "aliases": [ 00:16:01.846 "a6dd0598-c2c4-432e-b539-708f5dec134e" 00:16:01.846 ], 00:16:01.846 "product_name": "Malloc disk", 00:16:01.846 "block_size": 512, 00:16:01.846 "num_blocks": 65536, 00:16:01.846 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:01.846 "assigned_rate_limits": { 00:16:01.846 "rw_ios_per_sec": 0, 00:16:01.846 "rw_mbytes_per_sec": 0, 00:16:01.846 "r_mbytes_per_sec": 0, 00:16:01.846 "w_mbytes_per_sec": 0 00:16:01.846 }, 00:16:01.846 "claimed": true, 00:16:01.846 "claim_type": "exclusive_write", 00:16:01.846 "zoned": false, 00:16:01.846 "supported_io_types": { 00:16:01.846 "read": true, 00:16:01.846 "write": true, 00:16:01.846 "unmap": true, 00:16:01.846 "flush": true, 00:16:01.846 "reset": true, 00:16:01.846 "nvme_admin": false, 00:16:01.846 "nvme_io": false, 00:16:01.846 "nvme_io_md": false, 00:16:01.846 "write_zeroes": true, 00:16:01.846 "zcopy": true, 00:16:01.846 "get_zone_info": false, 00:16:01.846 "zone_management": false, 00:16:01.846 "zone_append": false, 00:16:01.846 "compare": false, 00:16:01.846 "compare_and_write": false, 00:16:01.846 "abort": true, 00:16:01.846 "seek_hole": false, 00:16:01.846 "seek_data": false, 00:16:01.846 "copy": true, 00:16:01.846 "nvme_iov_md": false 00:16:01.846 }, 00:16:01.846 "memory_domains": [ 00:16:01.846 { 00:16:01.846 "dma_device_id": "system", 00:16:01.846 "dma_device_type": 1 00:16:01.846 }, 00:16:01.846 { 00:16:01.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.846 
"dma_device_type": 2 00:16:01.846 } 00:16:01.846 ], 00:16:01.846 "driver_specific": {} 00:16:01.846 } 00:16:01.846 ] 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.846 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.105 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.105 "name": "Existed_Raid", 00:16:02.105 "uuid": "c4490db1-edda-4b40-973d-7c04fa345a02", 00:16:02.105 "strip_size_kb": 0, 
00:16:02.105 "state": "configuring", 00:16:02.105 "raid_level": "raid1", 00:16:02.105 "superblock": true, 00:16:02.105 "num_base_bdevs": 3, 00:16:02.105 "num_base_bdevs_discovered": 1, 00:16:02.105 "num_base_bdevs_operational": 3, 00:16:02.105 "base_bdevs_list": [ 00:16:02.105 { 00:16:02.105 "name": "BaseBdev1", 00:16:02.105 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:02.105 "is_configured": true, 00:16:02.105 "data_offset": 2048, 00:16:02.105 "data_size": 63488 00:16:02.105 }, 00:16:02.105 { 00:16:02.105 "name": "BaseBdev2", 00:16:02.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.105 "is_configured": false, 00:16:02.105 "data_offset": 0, 00:16:02.105 "data_size": 0 00:16:02.105 }, 00:16:02.105 { 00:16:02.105 "name": "BaseBdev3", 00:16:02.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.105 "is_configured": false, 00:16:02.105 "data_offset": 0, 00:16:02.105 "data_size": 0 00:16:02.105 } 00:16:02.105 ] 00:16:02.105 }' 00:16:02.105 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.105 13:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.668 13:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:02.926 [2024-07-15 13:34:42.132943] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:02.926 [2024-07-15 13:34:42.132984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x174c310 name Existed_Raid, state configuring 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:02.926 [2024-07-15 13:34:42.305443] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:02.926 [2024-07-15 13:34:42.306876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:02.926 [2024-07-15 13:34:42.306911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:02.926 [2024-07-15 13:34:42.306921] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:02.926 [2024-07-15 13:34:42.306942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.926 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.184 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.184 "name": "Existed_Raid", 00:16:03.184 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:03.184 "strip_size_kb": 0, 00:16:03.184 "state": "configuring", 00:16:03.184 "raid_level": "raid1", 00:16:03.184 "superblock": true, 00:16:03.184 "num_base_bdevs": 3, 00:16:03.184 "num_base_bdevs_discovered": 1, 00:16:03.184 "num_base_bdevs_operational": 3, 00:16:03.184 "base_bdevs_list": [ 00:16:03.184 { 00:16:03.184 "name": "BaseBdev1", 00:16:03.184 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:03.184 "is_configured": true, 00:16:03.184 "data_offset": 2048, 00:16:03.184 "data_size": 63488 00:16:03.184 }, 00:16:03.184 { 00:16:03.184 "name": "BaseBdev2", 00:16:03.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.184 "is_configured": false, 00:16:03.184 "data_offset": 0, 00:16:03.184 "data_size": 0 00:16:03.184 }, 00:16:03.184 { 00:16:03.184 "name": "BaseBdev3", 00:16:03.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.184 "is_configured": false, 00:16:03.184 "data_offset": 0, 00:16:03.184 "data_size": 0 00:16:03.184 } 00:16:03.184 ] 00:16:03.184 }' 00:16:03.184 13:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.184 13:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.748 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:04.006 
[2024-07-15 13:34:43.195219] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:04.006 BaseBdev2 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.006 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:04.264 [ 00:16:04.264 { 00:16:04.264 "name": "BaseBdev2", 00:16:04.264 "aliases": [ 00:16:04.264 "158438c6-2cd9-40dc-a978-e186e4c002f0" 00:16:04.264 ], 00:16:04.264 "product_name": "Malloc disk", 00:16:04.264 "block_size": 512, 00:16:04.264 "num_blocks": 65536, 00:16:04.264 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:04.264 "assigned_rate_limits": { 00:16:04.264 "rw_ios_per_sec": 0, 00:16:04.264 "rw_mbytes_per_sec": 0, 00:16:04.264 "r_mbytes_per_sec": 0, 00:16:04.264 "w_mbytes_per_sec": 0 00:16:04.264 }, 00:16:04.264 "claimed": true, 00:16:04.264 "claim_type": "exclusive_write", 00:16:04.264 "zoned": false, 00:16:04.264 "supported_io_types": { 00:16:04.264 "read": true, 00:16:04.264 "write": true, 00:16:04.264 "unmap": 
true, 00:16:04.264 "flush": true, 00:16:04.264 "reset": true, 00:16:04.264 "nvme_admin": false, 00:16:04.264 "nvme_io": false, 00:16:04.264 "nvme_io_md": false, 00:16:04.264 "write_zeroes": true, 00:16:04.264 "zcopy": true, 00:16:04.264 "get_zone_info": false, 00:16:04.264 "zone_management": false, 00:16:04.264 "zone_append": false, 00:16:04.264 "compare": false, 00:16:04.264 "compare_and_write": false, 00:16:04.264 "abort": true, 00:16:04.264 "seek_hole": false, 00:16:04.264 "seek_data": false, 00:16:04.264 "copy": true, 00:16:04.264 "nvme_iov_md": false 00:16:04.264 }, 00:16:04.264 "memory_domains": [ 00:16:04.264 { 00:16:04.264 "dma_device_id": "system", 00:16:04.264 "dma_device_type": 1 00:16:04.264 }, 00:16:04.264 { 00:16:04.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.264 "dma_device_type": 2 00:16:04.264 } 00:16:04.264 ], 00:16:04.264 "driver_specific": {} 00:16:04.264 } 00:16:04.264 ] 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.264 
13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.264 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.522 13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.522 "name": "Existed_Raid", 00:16:04.522 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:04.522 "strip_size_kb": 0, 00:16:04.522 "state": "configuring", 00:16:04.522 "raid_level": "raid1", 00:16:04.522 "superblock": true, 00:16:04.522 "num_base_bdevs": 3, 00:16:04.522 "num_base_bdevs_discovered": 2, 00:16:04.522 "num_base_bdevs_operational": 3, 00:16:04.522 "base_bdevs_list": [ 00:16:04.522 { 00:16:04.522 "name": "BaseBdev1", 00:16:04.522 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:04.522 "is_configured": true, 00:16:04.522 "data_offset": 2048, 00:16:04.522 "data_size": 63488 00:16:04.522 }, 00:16:04.522 { 00:16:04.522 "name": "BaseBdev2", 00:16:04.522 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:04.522 "is_configured": true, 00:16:04.522 "data_offset": 2048, 00:16:04.522 "data_size": 63488 00:16:04.522 }, 00:16:04.522 { 00:16:04.522 "name": "BaseBdev3", 00:16:04.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.522 "is_configured": false, 00:16:04.522 "data_offset": 0, 00:16:04.522 "data_size": 0 00:16:04.522 } 00:16:04.522 ] 00:16:04.522 }' 00:16:04.522 
13:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.522 13:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.086 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:05.344 [2024-07-15 13:34:44.562274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:05.344 [2024-07-15 13:34:44.562444] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x174d400 00:16:05.344 [2024-07-15 13:34:44.562459] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:05.344 [2024-07-15 13:34:44.562631] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x174cef0 00:16:05.344 [2024-07-15 13:34:44.562750] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x174d400 00:16:05.344 [2024-07-15 13:34:44.562761] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x174d400 00:16:05.344 [2024-07-15 13:34:44.562852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:05.344 BaseBdev3 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.344 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:05.602 [ 00:16:05.602 { 00:16:05.602 "name": "BaseBdev3", 00:16:05.602 "aliases": [ 00:16:05.602 "1f1f3d7d-c4b2-4772-906e-66556cb22a62" 00:16:05.602 ], 00:16:05.602 "product_name": "Malloc disk", 00:16:05.602 "block_size": 512, 00:16:05.602 "num_blocks": 65536, 00:16:05.602 "uuid": "1f1f3d7d-c4b2-4772-906e-66556cb22a62", 00:16:05.602 "assigned_rate_limits": { 00:16:05.602 "rw_ios_per_sec": 0, 00:16:05.602 "rw_mbytes_per_sec": 0, 00:16:05.602 "r_mbytes_per_sec": 0, 00:16:05.602 "w_mbytes_per_sec": 0 00:16:05.602 }, 00:16:05.602 "claimed": true, 00:16:05.602 "claim_type": "exclusive_write", 00:16:05.602 "zoned": false, 00:16:05.602 "supported_io_types": { 00:16:05.602 "read": true, 00:16:05.602 "write": true, 00:16:05.602 "unmap": true, 00:16:05.602 "flush": true, 00:16:05.602 "reset": true, 00:16:05.602 "nvme_admin": false, 00:16:05.602 "nvme_io": false, 00:16:05.602 "nvme_io_md": false, 00:16:05.602 "write_zeroes": true, 00:16:05.602 "zcopy": true, 00:16:05.602 "get_zone_info": false, 00:16:05.602 "zone_management": false, 00:16:05.602 "zone_append": false, 00:16:05.602 "compare": false, 00:16:05.602 "compare_and_write": false, 00:16:05.602 "abort": true, 00:16:05.602 "seek_hole": false, 00:16:05.602 "seek_data": false, 00:16:05.602 "copy": true, 00:16:05.602 "nvme_iov_md": false 00:16:05.602 }, 00:16:05.602 "memory_domains": [ 00:16:05.602 { 00:16:05.602 "dma_device_id": "system", 00:16:05.602 "dma_device_type": 1 00:16:05.602 }, 00:16:05.602 { 00:16:05.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.602 
"dma_device_type": 2 00:16:05.602 } 00:16:05.602 ], 00:16:05.602 "driver_specific": {} 00:16:05.602 } 00:16:05.602 ] 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.602 13:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.890 13:34:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.890 "name": "Existed_Raid", 00:16:05.890 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:05.890 "strip_size_kb": 0, 00:16:05.890 "state": "online", 00:16:05.890 "raid_level": "raid1", 00:16:05.890 "superblock": true, 00:16:05.890 "num_base_bdevs": 3, 00:16:05.890 "num_base_bdevs_discovered": 3, 00:16:05.890 "num_base_bdevs_operational": 3, 00:16:05.890 "base_bdevs_list": [ 00:16:05.890 { 00:16:05.890 "name": "BaseBdev1", 00:16:05.890 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:05.890 "is_configured": true, 00:16:05.890 "data_offset": 2048, 00:16:05.890 "data_size": 63488 00:16:05.890 }, 00:16:05.890 { 00:16:05.890 "name": "BaseBdev2", 00:16:05.890 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:05.890 "is_configured": true, 00:16:05.890 "data_offset": 2048, 00:16:05.890 "data_size": 63488 00:16:05.890 }, 00:16:05.890 { 00:16:05.890 "name": "BaseBdev3", 00:16:05.890 "uuid": "1f1f3d7d-c4b2-4772-906e-66556cb22a62", 00:16:05.890 "is_configured": true, 00:16:05.890 "data_offset": 2048, 00:16:05.890 "data_size": 63488 00:16:05.890 } 00:16:05.890 ] 00:16:05.890 }' 00:16:05.890 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.890 13:34:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:06.457 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:06.715 [2024-07-15 13:34:45.946272] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:06.715 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:06.715 "name": "Existed_Raid", 00:16:06.715 "aliases": [ 00:16:06.715 "8379a683-d910-4f08-b65e-44e98590e8eb" 00:16:06.715 ], 00:16:06.715 "product_name": "Raid Volume", 00:16:06.715 "block_size": 512, 00:16:06.715 "num_blocks": 63488, 00:16:06.715 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:06.715 "assigned_rate_limits": { 00:16:06.715 "rw_ios_per_sec": 0, 00:16:06.715 "rw_mbytes_per_sec": 0, 00:16:06.715 "r_mbytes_per_sec": 0, 00:16:06.715 "w_mbytes_per_sec": 0 00:16:06.715 }, 00:16:06.715 "claimed": false, 00:16:06.715 "zoned": false, 00:16:06.715 "supported_io_types": { 00:16:06.715 "read": true, 00:16:06.715 "write": true, 00:16:06.715 "unmap": false, 00:16:06.715 "flush": false, 00:16:06.715 "reset": true, 00:16:06.715 "nvme_admin": false, 00:16:06.715 "nvme_io": false, 00:16:06.715 "nvme_io_md": false, 00:16:06.715 "write_zeroes": true, 00:16:06.715 "zcopy": false, 00:16:06.715 "get_zone_info": false, 00:16:06.715 "zone_management": false, 00:16:06.715 "zone_append": false, 00:16:06.715 "compare": false, 00:16:06.715 "compare_and_write": false, 00:16:06.715 "abort": false, 00:16:06.715 "seek_hole": false, 00:16:06.715 "seek_data": false, 00:16:06.715 "copy": false, 00:16:06.715 "nvme_iov_md": false 00:16:06.715 }, 00:16:06.715 "memory_domains": [ 00:16:06.715 { 00:16:06.715 "dma_device_id": "system", 00:16:06.715 
"dma_device_type": 1 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.715 "dma_device_type": 2 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "dma_device_id": "system", 00:16:06.715 "dma_device_type": 1 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.715 "dma_device_type": 2 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "dma_device_id": "system", 00:16:06.715 "dma_device_type": 1 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.715 "dma_device_type": 2 00:16:06.715 } 00:16:06.715 ], 00:16:06.715 "driver_specific": { 00:16:06.715 "raid": { 00:16:06.715 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:06.715 "strip_size_kb": 0, 00:16:06.715 "state": "online", 00:16:06.715 "raid_level": "raid1", 00:16:06.715 "superblock": true, 00:16:06.715 "num_base_bdevs": 3, 00:16:06.715 "num_base_bdevs_discovered": 3, 00:16:06.715 "num_base_bdevs_operational": 3, 00:16:06.715 "base_bdevs_list": [ 00:16:06.715 { 00:16:06.715 "name": "BaseBdev1", 00:16:06.715 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:06.715 "is_configured": true, 00:16:06.715 "data_offset": 2048, 00:16:06.715 "data_size": 63488 00:16:06.715 }, 00:16:06.715 { 00:16:06.715 "name": "BaseBdev2", 00:16:06.715 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:06.715 "is_configured": true, 00:16:06.716 "data_offset": 2048, 00:16:06.716 "data_size": 63488 00:16:06.716 }, 00:16:06.716 { 00:16:06.716 "name": "BaseBdev3", 00:16:06.716 "uuid": "1f1f3d7d-c4b2-4772-906e-66556cb22a62", 00:16:06.716 "is_configured": true, 00:16:06.716 "data_offset": 2048, 00:16:06.716 "data_size": 63488 00:16:06.716 } 00:16:06.716 ] 00:16:06.716 } 00:16:06.716 } 00:16:06.716 }' 00:16:06.716 13:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:06.716 13:34:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:06.716 BaseBdev2 00:16:06.716 BaseBdev3' 00:16:06.716 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.716 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:06.716 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.974 "name": "BaseBdev1", 00:16:06.974 "aliases": [ 00:16:06.974 "a6dd0598-c2c4-432e-b539-708f5dec134e" 00:16:06.974 ], 00:16:06.974 "product_name": "Malloc disk", 00:16:06.974 "block_size": 512, 00:16:06.974 "num_blocks": 65536, 00:16:06.974 "uuid": "a6dd0598-c2c4-432e-b539-708f5dec134e", 00:16:06.974 "assigned_rate_limits": { 00:16:06.974 "rw_ios_per_sec": 0, 00:16:06.974 "rw_mbytes_per_sec": 0, 00:16:06.974 "r_mbytes_per_sec": 0, 00:16:06.974 "w_mbytes_per_sec": 0 00:16:06.974 }, 00:16:06.974 "claimed": true, 00:16:06.974 "claim_type": "exclusive_write", 00:16:06.974 "zoned": false, 00:16:06.974 "supported_io_types": { 00:16:06.974 "read": true, 00:16:06.974 "write": true, 00:16:06.974 "unmap": true, 00:16:06.974 "flush": true, 00:16:06.974 "reset": true, 00:16:06.974 "nvme_admin": false, 00:16:06.974 "nvme_io": false, 00:16:06.974 "nvme_io_md": false, 00:16:06.974 "write_zeroes": true, 00:16:06.974 "zcopy": true, 00:16:06.974 "get_zone_info": false, 00:16:06.974 "zone_management": false, 00:16:06.974 "zone_append": false, 00:16:06.974 "compare": false, 00:16:06.974 "compare_and_write": false, 00:16:06.974 "abort": true, 00:16:06.974 "seek_hole": false, 00:16:06.974 "seek_data": false, 00:16:06.974 "copy": true, 00:16:06.974 "nvme_iov_md": false 00:16:06.974 }, 00:16:06.974 "memory_domains": 
[ 00:16:06.974 { 00:16:06.974 "dma_device_id": "system", 00:16:06.974 "dma_device_type": 1 00:16:06.974 }, 00:16:06.974 { 00:16:06.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.974 "dma_device_type": 2 00:16:06.974 } 00:16:06.974 ], 00:16:06.974 "driver_specific": {} 00:16:06.974 }' 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.974 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.232 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:16:07.490 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.490 "name": "BaseBdev2", 00:16:07.490 "aliases": [ 00:16:07.490 "158438c6-2cd9-40dc-a978-e186e4c002f0" 00:16:07.490 ], 00:16:07.490 "product_name": "Malloc disk", 00:16:07.490 "block_size": 512, 00:16:07.490 "num_blocks": 65536, 00:16:07.490 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:07.490 "assigned_rate_limits": { 00:16:07.490 "rw_ios_per_sec": 0, 00:16:07.490 "rw_mbytes_per_sec": 0, 00:16:07.490 "r_mbytes_per_sec": 0, 00:16:07.490 "w_mbytes_per_sec": 0 00:16:07.490 }, 00:16:07.490 "claimed": true, 00:16:07.490 "claim_type": "exclusive_write", 00:16:07.490 "zoned": false, 00:16:07.490 "supported_io_types": { 00:16:07.490 "read": true, 00:16:07.490 "write": true, 00:16:07.490 "unmap": true, 00:16:07.490 "flush": true, 00:16:07.490 "reset": true, 00:16:07.490 "nvme_admin": false, 00:16:07.490 "nvme_io": false, 00:16:07.490 "nvme_io_md": false, 00:16:07.490 "write_zeroes": true, 00:16:07.490 "zcopy": true, 00:16:07.490 "get_zone_info": false, 00:16:07.490 "zone_management": false, 00:16:07.490 "zone_append": false, 00:16:07.490 "compare": false, 00:16:07.490 "compare_and_write": false, 00:16:07.490 "abort": true, 00:16:07.490 "seek_hole": false, 00:16:07.490 "seek_data": false, 00:16:07.490 "copy": true, 00:16:07.490 "nvme_iov_md": false 00:16:07.490 }, 00:16:07.490 "memory_domains": [ 00:16:07.490 { 00:16:07.490 "dma_device_id": "system", 00:16:07.490 "dma_device_type": 1 00:16:07.490 }, 00:16:07.490 { 00:16:07.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.490 "dma_device_type": 2 00:16:07.490 } 00:16:07.490 ], 00:16:07.490 "driver_specific": {} 00:16:07.490 }' 00:16:07.490 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.748 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.748 13:34:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.748 13:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.748 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.005 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.005 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.005 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:08.005 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:08.005 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:08.264 "name": "BaseBdev3", 00:16:08.264 "aliases": [ 00:16:08.264 "1f1f3d7d-c4b2-4772-906e-66556cb22a62" 00:16:08.264 ], 00:16:08.264 "product_name": "Malloc disk", 00:16:08.264 "block_size": 512, 00:16:08.264 "num_blocks": 65536, 00:16:08.264 "uuid": "1f1f3d7d-c4b2-4772-906e-66556cb22a62", 00:16:08.264 "assigned_rate_limits": { 00:16:08.264 "rw_ios_per_sec": 0, 00:16:08.264 "rw_mbytes_per_sec": 0, 00:16:08.264 "r_mbytes_per_sec": 0, 00:16:08.264 
"w_mbytes_per_sec": 0 00:16:08.264 }, 00:16:08.264 "claimed": true, 00:16:08.264 "claim_type": "exclusive_write", 00:16:08.264 "zoned": false, 00:16:08.264 "supported_io_types": { 00:16:08.264 "read": true, 00:16:08.264 "write": true, 00:16:08.264 "unmap": true, 00:16:08.264 "flush": true, 00:16:08.264 "reset": true, 00:16:08.264 "nvme_admin": false, 00:16:08.264 "nvme_io": false, 00:16:08.264 "nvme_io_md": false, 00:16:08.264 "write_zeroes": true, 00:16:08.264 "zcopy": true, 00:16:08.264 "get_zone_info": false, 00:16:08.264 "zone_management": false, 00:16:08.264 "zone_append": false, 00:16:08.264 "compare": false, 00:16:08.264 "compare_and_write": false, 00:16:08.264 "abort": true, 00:16:08.264 "seek_hole": false, 00:16:08.264 "seek_data": false, 00:16:08.264 "copy": true, 00:16:08.264 "nvme_iov_md": false 00:16:08.264 }, 00:16:08.264 "memory_domains": [ 00:16:08.264 { 00:16:08.264 "dma_device_id": "system", 00:16:08.264 "dma_device_type": 1 00:16:08.264 }, 00:16:08.264 { 00:16:08.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.264 "dma_device_type": 2 00:16:08.264 } 00:16:08.264 ], 00:16:08.264 "driver_specific": {} 00:16:08.264 }' 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.264 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:08.522 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.522 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.522 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.522 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.522 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.781 [2024-07-15 13:34:47.971420] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.781 13:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.040 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.040 "name": "Existed_Raid", 00:16:09.040 "uuid": "8379a683-d910-4f08-b65e-44e98590e8eb", 00:16:09.040 "strip_size_kb": 0, 00:16:09.040 "state": "online", 00:16:09.040 "raid_level": "raid1", 00:16:09.040 "superblock": true, 00:16:09.040 "num_base_bdevs": 3, 00:16:09.040 "num_base_bdevs_discovered": 2, 00:16:09.040 "num_base_bdevs_operational": 2, 00:16:09.040 "base_bdevs_list": [ 00:16:09.040 { 00:16:09.040 "name": null, 00:16:09.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.040 "is_configured": false, 00:16:09.040 "data_offset": 2048, 00:16:09.040 "data_size": 63488 00:16:09.040 }, 00:16:09.040 { 00:16:09.040 "name": "BaseBdev2", 00:16:09.040 "uuid": "158438c6-2cd9-40dc-a978-e186e4c002f0", 00:16:09.040 "is_configured": true, 00:16:09.040 "data_offset": 2048, 00:16:09.040 "data_size": 63488 00:16:09.040 }, 00:16:09.040 { 00:16:09.040 "name": "BaseBdev3", 00:16:09.040 "uuid": "1f1f3d7d-c4b2-4772-906e-66556cb22a62", 00:16:09.040 "is_configured": true, 00:16:09.040 "data_offset": 2048, 00:16:09.040 "data_size": 63488 00:16:09.040 } 
00:16:09.040 ] 00:16:09.040 }' 00:16:09.040 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.040 13:34:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.606 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:09.607 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.607 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.607 13:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:09.866 [2024-07-15 13:34:49.243960] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.866 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:10.125 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:10.125 13:34:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:10.125 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:10.384 [2024-07-15 13:34:49.751672] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.384 [2024-07-15 13:34:49.751767] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:10.384 [2024-07-15 13:34:49.762642] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:10.384 [2024-07-15 13:34:49.762680] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:10.384 [2024-07-15 13:34:49.762691] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x174d400 name Existed_Raid, state offline 00:16:10.384 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.384 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.384 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.384 13:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.642 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:10.901 BaseBdev2 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:10.901 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.160 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:11.419 [ 00:16:11.419 { 00:16:11.419 "name": "BaseBdev2", 00:16:11.419 "aliases": [ 00:16:11.419 "a8886faa-d2e7-4206-bf2b-8b0690e37bca" 00:16:11.419 ], 00:16:11.419 "product_name": "Malloc disk", 00:16:11.419 "block_size": 512, 00:16:11.419 "num_blocks": 65536, 00:16:11.419 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:11.419 "assigned_rate_limits": { 00:16:11.419 "rw_ios_per_sec": 0, 00:16:11.419 "rw_mbytes_per_sec": 0, 00:16:11.419 "r_mbytes_per_sec": 0, 00:16:11.419 "w_mbytes_per_sec": 0 00:16:11.419 }, 00:16:11.419 "claimed": false, 00:16:11.419 "zoned": false, 
00:16:11.419 "supported_io_types": { 00:16:11.419 "read": true, 00:16:11.419 "write": true, 00:16:11.419 "unmap": true, 00:16:11.419 "flush": true, 00:16:11.419 "reset": true, 00:16:11.419 "nvme_admin": false, 00:16:11.419 "nvme_io": false, 00:16:11.419 "nvme_io_md": false, 00:16:11.419 "write_zeroes": true, 00:16:11.419 "zcopy": true, 00:16:11.419 "get_zone_info": false, 00:16:11.419 "zone_management": false, 00:16:11.419 "zone_append": false, 00:16:11.419 "compare": false, 00:16:11.419 "compare_and_write": false, 00:16:11.419 "abort": true, 00:16:11.419 "seek_hole": false, 00:16:11.419 "seek_data": false, 00:16:11.419 "copy": true, 00:16:11.419 "nvme_iov_md": false 00:16:11.419 }, 00:16:11.419 "memory_domains": [ 00:16:11.419 { 00:16:11.419 "dma_device_id": "system", 00:16:11.419 "dma_device_type": 1 00:16:11.419 }, 00:16:11.419 { 00:16:11.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.419 "dma_device_type": 2 00:16:11.419 } 00:16:11.419 ], 00:16:11.419 "driver_specific": {} 00:16:11.419 } 00:16:11.419 ] 00:16:11.419 13:34:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:11.419 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.419 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.419 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:11.678 BaseBdev3 00:16:11.678 13:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:11.678 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:11.678 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:11.678 13:34:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:11.678 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:11.678 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:11.678 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.937 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:12.195 [ 00:16:12.195 { 00:16:12.195 "name": "BaseBdev3", 00:16:12.195 "aliases": [ 00:16:12.195 "ae3b37f0-2e94-41a7-8f1e-8a639e864d13" 00:16:12.195 ], 00:16:12.195 "product_name": "Malloc disk", 00:16:12.195 "block_size": 512, 00:16:12.195 "num_blocks": 65536, 00:16:12.195 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:12.195 "assigned_rate_limits": { 00:16:12.195 "rw_ios_per_sec": 0, 00:16:12.195 "rw_mbytes_per_sec": 0, 00:16:12.195 "r_mbytes_per_sec": 0, 00:16:12.195 "w_mbytes_per_sec": 0 00:16:12.195 }, 00:16:12.195 "claimed": false, 00:16:12.195 "zoned": false, 00:16:12.195 "supported_io_types": { 00:16:12.195 "read": true, 00:16:12.195 "write": true, 00:16:12.195 "unmap": true, 00:16:12.195 "flush": true, 00:16:12.195 "reset": true, 00:16:12.195 "nvme_admin": false, 00:16:12.195 "nvme_io": false, 00:16:12.195 "nvme_io_md": false, 00:16:12.195 "write_zeroes": true, 00:16:12.195 "zcopy": true, 00:16:12.195 "get_zone_info": false, 00:16:12.195 "zone_management": false, 00:16:12.195 "zone_append": false, 00:16:12.195 "compare": false, 00:16:12.195 "compare_and_write": false, 00:16:12.195 "abort": true, 00:16:12.195 "seek_hole": false, 00:16:12.195 "seek_data": false, 00:16:12.195 "copy": true, 00:16:12.195 "nvme_iov_md": 
false 00:16:12.195 }, 00:16:12.195 "memory_domains": [ 00:16:12.195 { 00:16:12.195 "dma_device_id": "system", 00:16:12.195 "dma_device_type": 1 00:16:12.195 }, 00:16:12.195 { 00:16:12.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.195 "dma_device_type": 2 00:16:12.195 } 00:16:12.195 ], 00:16:12.195 "driver_specific": {} 00:16:12.195 } 00:16:12.195 ] 00:16:12.195 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:12.195 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:12.195 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:12.195 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:12.454 [2024-07-15 13:34:51.717701] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:12.454 [2024-07-15 13:34:51.717750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:12.454 [2024-07-15 13:34:51.717769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.454 [2024-07-15 13:34:51.719115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.454 13:34:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.454 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.712 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.712 "name": "Existed_Raid", 00:16:12.712 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:12.712 "strip_size_kb": 0, 00:16:12.712 "state": "configuring", 00:16:12.712 "raid_level": "raid1", 00:16:12.712 "superblock": true, 00:16:12.712 "num_base_bdevs": 3, 00:16:12.712 "num_base_bdevs_discovered": 2, 00:16:12.712 "num_base_bdevs_operational": 3, 00:16:12.712 "base_bdevs_list": [ 00:16:12.712 { 00:16:12.712 "name": "BaseBdev1", 00:16:12.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.712 "is_configured": false, 00:16:12.712 "data_offset": 0, 00:16:12.712 "data_size": 0 00:16:12.712 }, 00:16:12.712 { 00:16:12.712 "name": "BaseBdev2", 00:16:12.712 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:12.712 "is_configured": true, 00:16:12.712 "data_offset": 2048, 00:16:12.712 "data_size": 63488 00:16:12.712 }, 00:16:12.712 { 00:16:12.712 "name": "BaseBdev3", 
00:16:12.712 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:12.712 "is_configured": true, 00:16:12.713 "data_offset": 2048, 00:16:12.713 "data_size": 63488 00:16:12.713 } 00:16:12.713 ] 00:16:12.713 }' 00:16:12.713 13:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.713 13:34:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.276 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:13.534 [2024-07-15 13:34:52.804566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.534 13:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.792 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.792 "name": "Existed_Raid", 00:16:13.792 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:13.792 "strip_size_kb": 0, 00:16:13.792 "state": "configuring", 00:16:13.792 "raid_level": "raid1", 00:16:13.792 "superblock": true, 00:16:13.792 "num_base_bdevs": 3, 00:16:13.792 "num_base_bdevs_discovered": 1, 00:16:13.792 "num_base_bdevs_operational": 3, 00:16:13.792 "base_bdevs_list": [ 00:16:13.792 { 00:16:13.792 "name": "BaseBdev1", 00:16:13.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.792 "is_configured": false, 00:16:13.792 "data_offset": 0, 00:16:13.792 "data_size": 0 00:16:13.792 }, 00:16:13.792 { 00:16:13.792 "name": null, 00:16:13.792 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:13.792 "is_configured": false, 00:16:13.792 "data_offset": 2048, 00:16:13.792 "data_size": 63488 00:16:13.792 }, 00:16:13.792 { 00:16:13.792 "name": "BaseBdev3", 00:16:13.792 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:13.792 "is_configured": true, 00:16:13.792 "data_offset": 2048, 00:16:13.792 "data_size": 63488 00:16:13.792 } 00:16:13.792 ] 00:16:13.792 }' 00:16:13.792 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.792 13:34:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.357 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:14.357 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.615 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:14.615 13:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:14.873 [2024-07-15 13:34:54.148789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:14.873 BaseBdev1 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:14.873 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.131 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:15.389 [ 00:16:15.389 { 00:16:15.389 "name": "BaseBdev1", 00:16:15.389 "aliases": [ 00:16:15.389 "ff0727fc-34b0-4e9e-bf07-bb96d0c36628" 00:16:15.389 ], 00:16:15.389 "product_name": "Malloc disk", 00:16:15.389 "block_size": 512, 00:16:15.389 "num_blocks": 65536, 00:16:15.389 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:15.389 
"assigned_rate_limits": { 00:16:15.389 "rw_ios_per_sec": 0, 00:16:15.389 "rw_mbytes_per_sec": 0, 00:16:15.389 "r_mbytes_per_sec": 0, 00:16:15.389 "w_mbytes_per_sec": 0 00:16:15.389 }, 00:16:15.389 "claimed": true, 00:16:15.389 "claim_type": "exclusive_write", 00:16:15.389 "zoned": false, 00:16:15.389 "supported_io_types": { 00:16:15.389 "read": true, 00:16:15.389 "write": true, 00:16:15.389 "unmap": true, 00:16:15.389 "flush": true, 00:16:15.389 "reset": true, 00:16:15.389 "nvme_admin": false, 00:16:15.389 "nvme_io": false, 00:16:15.389 "nvme_io_md": false, 00:16:15.389 "write_zeroes": true, 00:16:15.389 "zcopy": true, 00:16:15.389 "get_zone_info": false, 00:16:15.389 "zone_management": false, 00:16:15.389 "zone_append": false, 00:16:15.389 "compare": false, 00:16:15.389 "compare_and_write": false, 00:16:15.389 "abort": true, 00:16:15.389 "seek_hole": false, 00:16:15.389 "seek_data": false, 00:16:15.389 "copy": true, 00:16:15.389 "nvme_iov_md": false 00:16:15.389 }, 00:16:15.389 "memory_domains": [ 00:16:15.389 { 00:16:15.389 "dma_device_id": "system", 00:16:15.389 "dma_device_type": 1 00:16:15.389 }, 00:16:15.389 { 00:16:15.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.389 "dma_device_type": 2 00:16:15.389 } 00:16:15.389 ], 00:16:15.389 "driver_specific": {} 00:16:15.389 } 00:16:15.389 ] 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.389 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:15.648 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.648 "name": "Existed_Raid", 00:16:15.648 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:15.648 "strip_size_kb": 0, 00:16:15.648 "state": "configuring", 00:16:15.648 "raid_level": "raid1", 00:16:15.648 "superblock": true, 00:16:15.648 "num_base_bdevs": 3, 00:16:15.648 "num_base_bdevs_discovered": 2, 00:16:15.648 "num_base_bdevs_operational": 3, 00:16:15.648 "base_bdevs_list": [ 00:16:15.648 { 00:16:15.648 "name": "BaseBdev1", 00:16:15.648 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:15.648 "is_configured": true, 00:16:15.648 "data_offset": 2048, 00:16:15.648 "data_size": 63488 00:16:15.648 }, 00:16:15.648 { 00:16:15.648 "name": null, 00:16:15.649 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:15.649 "is_configured": false, 00:16:15.649 "data_offset": 2048, 00:16:15.649 "data_size": 63488 00:16:15.649 }, 00:16:15.649 { 00:16:15.649 "name": "BaseBdev3", 00:16:15.649 "uuid": 
"ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:15.649 "is_configured": true, 00:16:15.649 "data_offset": 2048, 00:16:15.649 "data_size": 63488 00:16:15.649 } 00:16:15.649 ] 00:16:15.649 }' 00:16:15.649 13:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.649 13:34:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.216 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:16.216 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.475 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:16.475 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:16.734 [2024-07-15 13:34:55.961638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.734 13:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.993 13:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.993 "name": "Existed_Raid", 00:16:16.993 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:16.993 "strip_size_kb": 0, 00:16:16.993 "state": "configuring", 00:16:16.993 "raid_level": "raid1", 00:16:16.993 "superblock": true, 00:16:16.993 "num_base_bdevs": 3, 00:16:16.993 "num_base_bdevs_discovered": 1, 00:16:16.993 "num_base_bdevs_operational": 3, 00:16:16.993 "base_bdevs_list": [ 00:16:16.993 { 00:16:16.993 "name": "BaseBdev1", 00:16:16.993 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:16.993 "is_configured": true, 00:16:16.993 "data_offset": 2048, 00:16:16.993 "data_size": 63488 00:16:16.993 }, 00:16:16.993 { 00:16:16.993 "name": null, 00:16:16.993 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:16.993 "is_configured": false, 00:16:16.993 "data_offset": 2048, 00:16:16.993 "data_size": 63488 00:16:16.993 }, 00:16:16.993 { 00:16:16.993 "name": null, 00:16:16.993 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:16.993 "is_configured": false, 00:16:16.993 "data_offset": 2048, 00:16:16.993 "data_size": 63488 00:16:16.993 } 00:16:16.993 ] 00:16:16.993 }' 00:16:16.993 13:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:16:16.993 13:34:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:17.560 13:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.560 13:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:17.818 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:17.818 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:18.077 [2024-07-15 13:34:57.277147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.077 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.335 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.335 "name": "Existed_Raid", 00:16:18.335 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:18.335 "strip_size_kb": 0, 00:16:18.335 "state": "configuring", 00:16:18.335 "raid_level": "raid1", 00:16:18.335 "superblock": true, 00:16:18.335 "num_base_bdevs": 3, 00:16:18.335 "num_base_bdevs_discovered": 2, 00:16:18.335 "num_base_bdevs_operational": 3, 00:16:18.335 "base_bdevs_list": [ 00:16:18.335 { 00:16:18.335 "name": "BaseBdev1", 00:16:18.335 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:18.335 "is_configured": true, 00:16:18.335 "data_offset": 2048, 00:16:18.335 "data_size": 63488 00:16:18.335 }, 00:16:18.335 { 00:16:18.335 "name": null, 00:16:18.335 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:18.335 "is_configured": false, 00:16:18.335 "data_offset": 2048, 00:16:18.335 "data_size": 63488 00:16:18.335 }, 00:16:18.335 { 00:16:18.335 "name": "BaseBdev3", 00:16:18.335 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:18.335 "is_configured": true, 00:16:18.335 "data_offset": 2048, 00:16:18.335 "data_size": 63488 00:16:18.335 } 00:16:18.335 ] 00:16:18.335 }' 00:16:18.335 13:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.335 13:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:18.902 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.902 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:19.161 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:19.161 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:19.420 [2024-07-15 13:34:58.604695] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.420 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.679 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.679 "name": "Existed_Raid", 00:16:19.679 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:19.679 "strip_size_kb": 0, 00:16:19.679 "state": "configuring", 00:16:19.679 "raid_level": "raid1", 00:16:19.679 "superblock": true, 00:16:19.679 "num_base_bdevs": 3, 00:16:19.679 "num_base_bdevs_discovered": 1, 00:16:19.679 "num_base_bdevs_operational": 3, 00:16:19.679 "base_bdevs_list": [ 00:16:19.679 { 00:16:19.679 "name": null, 00:16:19.679 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:19.679 "is_configured": false, 00:16:19.679 "data_offset": 2048, 00:16:19.679 "data_size": 63488 00:16:19.679 }, 00:16:19.679 { 00:16:19.679 "name": null, 00:16:19.679 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:19.679 "is_configured": false, 00:16:19.679 "data_offset": 2048, 00:16:19.679 "data_size": 63488 00:16:19.679 }, 00:16:19.679 { 00:16:19.679 "name": "BaseBdev3", 00:16:19.679 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:19.679 "is_configured": true, 00:16:19.679 "data_offset": 2048, 00:16:19.679 "data_size": 63488 00:16:19.679 } 00:16:19.679 ] 00:16:19.679 }' 00:16:19.679 13:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.679 13:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:20.284 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.284 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:20.543 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:20.543 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:20.802 [2024-07-15 13:34:59.972340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.802 13:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:21.061 13:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.061 "name": "Existed_Raid", 00:16:21.061 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:21.061 "strip_size_kb": 0, 00:16:21.061 "state": "configuring", 00:16:21.061 "raid_level": "raid1", 00:16:21.061 "superblock": true, 00:16:21.061 "num_base_bdevs": 3, 00:16:21.061 "num_base_bdevs_discovered": 2, 00:16:21.061 "num_base_bdevs_operational": 3, 00:16:21.061 "base_bdevs_list": [ 00:16:21.061 { 00:16:21.061 "name": null, 00:16:21.061 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:21.061 "is_configured": false, 00:16:21.062 "data_offset": 2048, 00:16:21.062 "data_size": 63488 00:16:21.062 }, 00:16:21.062 { 00:16:21.062 "name": "BaseBdev2", 00:16:21.062 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:21.062 "is_configured": true, 00:16:21.062 "data_offset": 2048, 00:16:21.062 "data_size": 63488 00:16:21.062 }, 00:16:21.062 { 00:16:21.062 "name": "BaseBdev3", 00:16:21.062 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:21.062 "is_configured": true, 00:16:21.062 "data_offset": 2048, 00:16:21.062 "data_size": 63488 00:16:21.062 } 00:16:21.062 ] 00:16:21.062 }' 00:16:21.062 13:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.062 13:35:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:21.630 13:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.630 13:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:21.630 13:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:21.630 13:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.630 13:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:21.888 13:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ff0727fc-34b0-4e9e-bf07-bb96d0c36628 00:16:22.147 [2024-07-15 13:35:01.480956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:22.147 [2024-07-15 13:35:01.481127] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17431b0 00:16:22.147 [2024-07-15 13:35:01.481140] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:22.147 [2024-07-15 13:35:01.481318] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ff4f0 00:16:22.147 [2024-07-15 13:35:01.481440] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17431b0 00:16:22.147 [2024-07-15 13:35:01.481450] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17431b0 00:16:22.147 [2024-07-15 13:35:01.481544] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.147 NewBaseBdev 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.147 
13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.147 13:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.714 13:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:22.973 [ 00:16:22.973 { 00:16:22.973 "name": "NewBaseBdev", 00:16:22.973 "aliases": [ 00:16:22.973 "ff0727fc-34b0-4e9e-bf07-bb96d0c36628" 00:16:22.973 ], 00:16:22.973 "product_name": "Malloc disk", 00:16:22.973 "block_size": 512, 00:16:22.973 "num_blocks": 65536, 00:16:22.973 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:22.973 "assigned_rate_limits": { 00:16:22.973 "rw_ios_per_sec": 0, 00:16:22.973 "rw_mbytes_per_sec": 0, 00:16:22.973 "r_mbytes_per_sec": 0, 00:16:22.973 "w_mbytes_per_sec": 0 00:16:22.973 }, 00:16:22.973 "claimed": true, 00:16:22.973 "claim_type": "exclusive_write", 00:16:22.973 "zoned": false, 00:16:22.973 "supported_io_types": { 00:16:22.973 "read": true, 00:16:22.973 "write": true, 00:16:22.973 "unmap": true, 00:16:22.973 "flush": true, 00:16:22.973 "reset": true, 00:16:22.973 "nvme_admin": false, 00:16:22.973 "nvme_io": false, 00:16:22.973 "nvme_io_md": false, 00:16:22.973 "write_zeroes": true, 00:16:22.973 "zcopy": true, 00:16:22.973 "get_zone_info": false, 00:16:22.973 "zone_management": false, 00:16:22.973 "zone_append": false, 00:16:22.973 "compare": false, 00:16:22.973 "compare_and_write": false, 00:16:22.973 "abort": true, 00:16:22.973 "seek_hole": false, 00:16:22.973 "seek_data": false, 00:16:22.973 "copy": true, 00:16:22.973 "nvme_iov_md": false 00:16:22.973 }, 00:16:22.973 "memory_domains": [ 00:16:22.973 { 00:16:22.973 "dma_device_id": "system", 00:16:22.973 "dma_device_type": 1 00:16:22.973 
}, 00:16:22.973 { 00:16:22.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.973 "dma_device_type": 2 00:16:22.973 } 00:16:22.973 ], 00:16:22.973 "driver_specific": {} 00:16:22.973 } 00:16:22.973 ] 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.973 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.232 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.232 "name": "Existed_Raid", 00:16:23.232 "uuid": 
"375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:23.232 "strip_size_kb": 0, 00:16:23.232 "state": "online", 00:16:23.232 "raid_level": "raid1", 00:16:23.232 "superblock": true, 00:16:23.232 "num_base_bdevs": 3, 00:16:23.232 "num_base_bdevs_discovered": 3, 00:16:23.232 "num_base_bdevs_operational": 3, 00:16:23.232 "base_bdevs_list": [ 00:16:23.232 { 00:16:23.232 "name": "NewBaseBdev", 00:16:23.232 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:23.232 "is_configured": true, 00:16:23.232 "data_offset": 2048, 00:16:23.232 "data_size": 63488 00:16:23.232 }, 00:16:23.232 { 00:16:23.232 "name": "BaseBdev2", 00:16:23.232 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:23.232 "is_configured": true, 00:16:23.232 "data_offset": 2048, 00:16:23.232 "data_size": 63488 00:16:23.233 }, 00:16:23.233 { 00:16:23.233 "name": "BaseBdev3", 00:16:23.233 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:23.233 "is_configured": true, 00:16:23.233 "data_offset": 2048, 00:16:23.233 "data_size": 63488 00:16:23.233 } 00:16:23.233 ] 00:16:23.233 }' 00:16:23.233 13:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.233 13:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:23.800 13:35:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:23.800 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:24.059 [2024-07-15 13:35:03.249963] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:24.059 "name": "Existed_Raid", 00:16:24.059 "aliases": [ 00:16:24.059 "375733d3-f1c8-443a-a390-db61cc937c2f" 00:16:24.059 ], 00:16:24.059 "product_name": "Raid Volume", 00:16:24.059 "block_size": 512, 00:16:24.059 "num_blocks": 63488, 00:16:24.059 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:24.059 "assigned_rate_limits": { 00:16:24.059 "rw_ios_per_sec": 0, 00:16:24.059 "rw_mbytes_per_sec": 0, 00:16:24.059 "r_mbytes_per_sec": 0, 00:16:24.059 "w_mbytes_per_sec": 0 00:16:24.059 }, 00:16:24.059 "claimed": false, 00:16:24.059 "zoned": false, 00:16:24.059 "supported_io_types": { 00:16:24.059 "read": true, 00:16:24.059 "write": true, 00:16:24.059 "unmap": false, 00:16:24.059 "flush": false, 00:16:24.059 "reset": true, 00:16:24.059 "nvme_admin": false, 00:16:24.059 "nvme_io": false, 00:16:24.059 "nvme_io_md": false, 00:16:24.059 "write_zeroes": true, 00:16:24.059 "zcopy": false, 00:16:24.059 "get_zone_info": false, 00:16:24.059 "zone_management": false, 00:16:24.059 "zone_append": false, 00:16:24.059 "compare": false, 00:16:24.059 "compare_and_write": false, 00:16:24.059 "abort": false, 00:16:24.059 "seek_hole": false, 00:16:24.059 "seek_data": false, 00:16:24.059 "copy": false, 00:16:24.059 "nvme_iov_md": false 00:16:24.059 }, 00:16:24.059 "memory_domains": [ 00:16:24.059 { 00:16:24.059 "dma_device_id": "system", 00:16:24.059 "dma_device_type": 1 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.059 
"dma_device_type": 2 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "dma_device_id": "system", 00:16:24.059 "dma_device_type": 1 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.059 "dma_device_type": 2 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "dma_device_id": "system", 00:16:24.059 "dma_device_type": 1 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.059 "dma_device_type": 2 00:16:24.059 } 00:16:24.059 ], 00:16:24.059 "driver_specific": { 00:16:24.059 "raid": { 00:16:24.059 "uuid": "375733d3-f1c8-443a-a390-db61cc937c2f", 00:16:24.059 "strip_size_kb": 0, 00:16:24.059 "state": "online", 00:16:24.059 "raid_level": "raid1", 00:16:24.059 "superblock": true, 00:16:24.059 "num_base_bdevs": 3, 00:16:24.059 "num_base_bdevs_discovered": 3, 00:16:24.059 "num_base_bdevs_operational": 3, 00:16:24.059 "base_bdevs_list": [ 00:16:24.059 { 00:16:24.059 "name": "NewBaseBdev", 00:16:24.059 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:24.059 "is_configured": true, 00:16:24.059 "data_offset": 2048, 00:16:24.059 "data_size": 63488 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "name": "BaseBdev2", 00:16:24.059 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:24.059 "is_configured": true, 00:16:24.059 "data_offset": 2048, 00:16:24.059 "data_size": 63488 00:16:24.059 }, 00:16:24.059 { 00:16:24.059 "name": "BaseBdev3", 00:16:24.059 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:24.059 "is_configured": true, 00:16:24.059 "data_offset": 2048, 00:16:24.059 "data_size": 63488 00:16:24.059 } 00:16:24.059 ] 00:16:24.059 } 00:16:24.059 } 00:16:24.059 }' 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:24.059 BaseBdev2 00:16:24.059 
BaseBdev3' 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:24.059 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.318 "name": "NewBaseBdev", 00:16:24.318 "aliases": [ 00:16:24.318 "ff0727fc-34b0-4e9e-bf07-bb96d0c36628" 00:16:24.318 ], 00:16:24.318 "product_name": "Malloc disk", 00:16:24.318 "block_size": 512, 00:16:24.318 "num_blocks": 65536, 00:16:24.318 "uuid": "ff0727fc-34b0-4e9e-bf07-bb96d0c36628", 00:16:24.318 "assigned_rate_limits": { 00:16:24.318 "rw_ios_per_sec": 0, 00:16:24.318 "rw_mbytes_per_sec": 0, 00:16:24.318 "r_mbytes_per_sec": 0, 00:16:24.318 "w_mbytes_per_sec": 0 00:16:24.318 }, 00:16:24.318 "claimed": true, 00:16:24.318 "claim_type": "exclusive_write", 00:16:24.318 "zoned": false, 00:16:24.318 "supported_io_types": { 00:16:24.318 "read": true, 00:16:24.318 "write": true, 00:16:24.318 "unmap": true, 00:16:24.318 "flush": true, 00:16:24.318 "reset": true, 00:16:24.318 "nvme_admin": false, 00:16:24.318 "nvme_io": false, 00:16:24.318 "nvme_io_md": false, 00:16:24.318 "write_zeroes": true, 00:16:24.318 "zcopy": true, 00:16:24.318 "get_zone_info": false, 00:16:24.318 "zone_management": false, 00:16:24.318 "zone_append": false, 00:16:24.318 "compare": false, 00:16:24.318 "compare_and_write": false, 00:16:24.318 "abort": true, 00:16:24.318 "seek_hole": false, 00:16:24.318 "seek_data": false, 00:16:24.318 "copy": true, 00:16:24.318 "nvme_iov_md": false 00:16:24.318 }, 00:16:24.318 "memory_domains": [ 00:16:24.318 { 00:16:24.318 "dma_device_id": "system", 00:16:24.318 "dma_device_type": 1 00:16:24.318 }, 00:16:24.318 { 
00:16:24.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.318 "dma_device_type": 2 00:16:24.318 } 00:16:24.318 ], 00:16:24.318 "driver_specific": {} 00:16:24.318 }' 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.318 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:24.577 13:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.835 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.835 "name": 
"BaseBdev2", 00:16:24.835 "aliases": [ 00:16:24.835 "a8886faa-d2e7-4206-bf2b-8b0690e37bca" 00:16:24.835 ], 00:16:24.835 "product_name": "Malloc disk", 00:16:24.835 "block_size": 512, 00:16:24.835 "num_blocks": 65536, 00:16:24.835 "uuid": "a8886faa-d2e7-4206-bf2b-8b0690e37bca", 00:16:24.835 "assigned_rate_limits": { 00:16:24.835 "rw_ios_per_sec": 0, 00:16:24.835 "rw_mbytes_per_sec": 0, 00:16:24.835 "r_mbytes_per_sec": 0, 00:16:24.835 "w_mbytes_per_sec": 0 00:16:24.835 }, 00:16:24.835 "claimed": true, 00:16:24.835 "claim_type": "exclusive_write", 00:16:24.835 "zoned": false, 00:16:24.835 "supported_io_types": { 00:16:24.835 "read": true, 00:16:24.835 "write": true, 00:16:24.835 "unmap": true, 00:16:24.835 "flush": true, 00:16:24.835 "reset": true, 00:16:24.835 "nvme_admin": false, 00:16:24.835 "nvme_io": false, 00:16:24.835 "nvme_io_md": false, 00:16:24.835 "write_zeroes": true, 00:16:24.835 "zcopy": true, 00:16:24.835 "get_zone_info": false, 00:16:24.835 "zone_management": false, 00:16:24.835 "zone_append": false, 00:16:24.835 "compare": false, 00:16:24.835 "compare_and_write": false, 00:16:24.835 "abort": true, 00:16:24.835 "seek_hole": false, 00:16:24.835 "seek_data": false, 00:16:24.835 "copy": true, 00:16:24.835 "nvme_iov_md": false 00:16:24.835 }, 00:16:24.835 "memory_domains": [ 00:16:24.835 { 00:16:24.835 "dma_device_id": "system", 00:16:24.835 "dma_device_type": 1 00:16:24.835 }, 00:16:24.835 { 00:16:24.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.835 "dma_device_type": 2 00:16:24.835 } 00:16:24.835 ], 00:16:24.835 "driver_specific": {} 00:16:24.835 }' 00:16:24.835 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.835 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.835 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.835 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:25.093 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.351 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.351 "name": "BaseBdev3", 00:16:25.351 "aliases": [ 00:16:25.351 "ae3b37f0-2e94-41a7-8f1e-8a639e864d13" 00:16:25.351 ], 00:16:25.351 "product_name": "Malloc disk", 00:16:25.351 "block_size": 512, 00:16:25.351 "num_blocks": 65536, 00:16:25.351 "uuid": "ae3b37f0-2e94-41a7-8f1e-8a639e864d13", 00:16:25.351 "assigned_rate_limits": { 00:16:25.351 "rw_ios_per_sec": 0, 00:16:25.351 "rw_mbytes_per_sec": 0, 00:16:25.351 "r_mbytes_per_sec": 0, 00:16:25.351 "w_mbytes_per_sec": 0 00:16:25.351 }, 00:16:25.351 "claimed": true, 00:16:25.351 "claim_type": "exclusive_write", 00:16:25.351 "zoned": 
false, 00:16:25.351 "supported_io_types": { 00:16:25.351 "read": true, 00:16:25.351 "write": true, 00:16:25.351 "unmap": true, 00:16:25.351 "flush": true, 00:16:25.351 "reset": true, 00:16:25.351 "nvme_admin": false, 00:16:25.351 "nvme_io": false, 00:16:25.351 "nvme_io_md": false, 00:16:25.351 "write_zeroes": true, 00:16:25.351 "zcopy": true, 00:16:25.351 "get_zone_info": false, 00:16:25.351 "zone_management": false, 00:16:25.351 "zone_append": false, 00:16:25.351 "compare": false, 00:16:25.351 "compare_and_write": false, 00:16:25.351 "abort": true, 00:16:25.351 "seek_hole": false, 00:16:25.351 "seek_data": false, 00:16:25.351 "copy": true, 00:16:25.351 "nvme_iov_md": false 00:16:25.351 }, 00:16:25.351 "memory_domains": [ 00:16:25.351 { 00:16:25.351 "dma_device_id": "system", 00:16:25.351 "dma_device_type": 1 00:16:25.351 }, 00:16:25.351 { 00:16:25.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.351 "dma_device_type": 2 00:16:25.351 } 00:16:25.351 ], 00:16:25.351 "driver_specific": {} 00:16:25.351 }' 00:16:25.351 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.610 13:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.610 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.610 13:35:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.870 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.870 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.870 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:26.129 [2024-07-15 13:35:05.311155] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:26.129 [2024-07-15 13:35:05.311185] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:26.129 [2024-07-15 13:35:05.311242] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:26.129 [2024-07-15 13:35:05.311517] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:26.129 [2024-07-15 13:35:05.311530] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17431b0 name Existed_Raid, state offline 00:16:26.129 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2118281 00:16:26.129 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2118281 ']' 00:16:26.129 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2118281 00:16:26.129 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2118281 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2118281' 00:16:26.130 killing process with pid 2118281 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2118281 00:16:26.130 [2024-07-15 13:35:05.381323] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:26.130 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2118281 00:16:26.130 [2024-07-15 13:35:05.407526] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:26.389 13:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:26.389 00:16:26.389 real 0m27.814s 00:16:26.389 user 0m50.971s 00:16:26.389 sys 0m5.048s 00:16:26.389 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:26.389 13:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.389 ************************************ 00:16:26.389 END TEST raid_state_function_test_sb 00:16:26.389 ************************************ 00:16:26.389 13:35:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:26.389 13:35:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:26.389 13:35:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:26.389 13:35:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:26.389 13:35:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:26.389 ************************************ 00:16:26.389 START TEST raid_superblock_test 00:16:26.389 ************************************ 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:26.389 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2122479 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2122479 /var/tmp/spdk-raid.sock 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2122479 ']' 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:26.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:26.390 13:35:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.390 [2024-07-15 13:35:05.762316] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:16:26.390 [2024-07-15 13:35:05.762383] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2122479 ] 00:16:26.649 [2024-07-15 13:35:05.890767] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:26.649 [2024-07-15 13:35:05.988715] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:26.649 [2024-07-15 13:35:06.052568] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.649 [2024-07-15 13:35:06.052615] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:27.586 malloc1 00:16:27.586 13:35:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:27.845 [2024-07-15 13:35:07.167229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:27.845 [2024-07-15 13:35:07.167285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.845 [2024-07-15 13:35:07.167307] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1adf570 00:16:27.845 [2024-07-15 13:35:07.167321] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.845 [2024-07-15 13:35:07.169063] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.845 [2024-07-15 13:35:07.169093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:27.845 pt1 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:27.845 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:27.845 13:35:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:28.104 malloc2 00:16:28.104 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:28.362 [2024-07-15 13:35:07.669453] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:28.362 [2024-07-15 13:35:07.669500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.362 [2024-07-15 13:35:07.669518] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ae0970 00:16:28.362 [2024-07-15 13:35:07.669530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.362 [2024-07-15 13:35:07.670974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.362 [2024-07-15 13:35:07.671001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:28.362 pt2 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:28.362 13:35:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:28.362 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:28.621 malloc3 00:16:28.621 13:35:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:28.879 [2024-07-15 13:35:08.163366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:28.879 [2024-07-15 13:35:08.163412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.879 [2024-07-15 13:35:08.163428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c77340 00:16:28.879 [2024-07-15 13:35:08.163441] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.879 [2024-07-15 13:35:08.164804] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.879 [2024-07-15 13:35:08.164829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:28.879 pt3 00:16:28.879 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:28.879 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:28.879 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:29.139 [2024-07-15 13:35:08.408040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:29.139 [2024-07-15 13:35:08.409271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:29.139 [2024-07-15 13:35:08.409326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:29.139 [2024-07-15 13:35:08.409478] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad7ea0 00:16:29.139 [2024-07-15 13:35:08.409490] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:29.139 [2024-07-15 13:35:08.409680] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1adf240 00:16:29.139 [2024-07-15 13:35:08.409824] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad7ea0 00:16:29.139 [2024-07-15 13:35:08.409834] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad7ea0 00:16:29.139 [2024-07-15 13:35:08.409936] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.139 13:35:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.139 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:29.399 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.399 "name": "raid_bdev1", 00:16:29.399 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:29.399 "strip_size_kb": 0, 00:16:29.399 "state": "online", 00:16:29.399 "raid_level": "raid1", 00:16:29.399 "superblock": true, 00:16:29.399 "num_base_bdevs": 3, 00:16:29.399 "num_base_bdevs_discovered": 3, 00:16:29.399 "num_base_bdevs_operational": 3, 00:16:29.399 "base_bdevs_list": [ 00:16:29.399 { 00:16:29.399 "name": "pt1", 00:16:29.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:29.399 "is_configured": true, 00:16:29.399 "data_offset": 2048, 00:16:29.399 "data_size": 63488 00:16:29.399 }, 00:16:29.399 { 00:16:29.399 "name": "pt2", 00:16:29.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:29.399 "is_configured": true, 00:16:29.399 "data_offset": 2048, 00:16:29.399 "data_size": 63488 00:16:29.399 }, 00:16:29.399 { 00:16:29.399 "name": "pt3", 00:16:29.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:29.399 "is_configured": true, 00:16:29.399 "data_offset": 2048, 00:16:29.399 "data_size": 63488 00:16:29.399 } 00:16:29.399 ] 00:16:29.399 }' 00:16:29.399 13:35:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.399 13:35:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:29.966 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:30.225 [2024-07-15 13:35:09.443086] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:30.225 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:30.225 "name": "raid_bdev1", 00:16:30.225 "aliases": [ 00:16:30.225 "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4" 00:16:30.225 ], 00:16:30.225 "product_name": "Raid Volume", 00:16:30.225 "block_size": 512, 00:16:30.225 "num_blocks": 63488, 00:16:30.225 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:30.225 "assigned_rate_limits": { 00:16:30.225 "rw_ios_per_sec": 0, 00:16:30.225 "rw_mbytes_per_sec": 0, 00:16:30.225 "r_mbytes_per_sec": 0, 00:16:30.225 "w_mbytes_per_sec": 0 00:16:30.225 }, 00:16:30.225 "claimed": false, 00:16:30.225 "zoned": false, 00:16:30.225 "supported_io_types": { 00:16:30.225 "read": true, 00:16:30.225 "write": true, 00:16:30.225 "unmap": false, 00:16:30.225 "flush": false, 00:16:30.225 "reset": true, 00:16:30.225 "nvme_admin": false, 00:16:30.225 "nvme_io": false, 00:16:30.225 "nvme_io_md": false, 00:16:30.225 "write_zeroes": true, 00:16:30.225 "zcopy": false, 00:16:30.225 "get_zone_info": false, 00:16:30.225 "zone_management": false, 00:16:30.225 "zone_append": false, 00:16:30.225 "compare": false, 00:16:30.225 "compare_and_write": false, 00:16:30.225 "abort": false, 00:16:30.225 "seek_hole": false, 
00:16:30.225 "seek_data": false, 00:16:30.225 "copy": false, 00:16:30.225 "nvme_iov_md": false 00:16:30.225 }, 00:16:30.225 "memory_domains": [ 00:16:30.225 { 00:16:30.225 "dma_device_id": "system", 00:16:30.225 "dma_device_type": 1 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.225 "dma_device_type": 2 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "dma_device_id": "system", 00:16:30.225 "dma_device_type": 1 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.225 "dma_device_type": 2 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "dma_device_id": "system", 00:16:30.225 "dma_device_type": 1 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.225 "dma_device_type": 2 00:16:30.225 } 00:16:30.225 ], 00:16:30.225 "driver_specific": { 00:16:30.225 "raid": { 00:16:30.225 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:30.225 "strip_size_kb": 0, 00:16:30.225 "state": "online", 00:16:30.225 "raid_level": "raid1", 00:16:30.225 "superblock": true, 00:16:30.225 "num_base_bdevs": 3, 00:16:30.225 "num_base_bdevs_discovered": 3, 00:16:30.225 "num_base_bdevs_operational": 3, 00:16:30.225 "base_bdevs_list": [ 00:16:30.225 { 00:16:30.225 "name": "pt1", 00:16:30.225 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:30.225 "is_configured": true, 00:16:30.225 "data_offset": 2048, 00:16:30.225 "data_size": 63488 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "name": "pt2", 00:16:30.225 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:30.225 "is_configured": true, 00:16:30.225 "data_offset": 2048, 00:16:30.225 "data_size": 63488 00:16:30.225 }, 00:16:30.225 { 00:16:30.225 "name": "pt3", 00:16:30.225 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:30.225 "is_configured": true, 00:16:30.225 "data_offset": 2048, 00:16:30.225 "data_size": 63488 00:16:30.225 } 00:16:30.225 ] 00:16:30.225 } 00:16:30.225 } 00:16:30.225 }' 00:16:30.225 13:35:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:30.225 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:30.225 pt2 00:16:30.225 pt3' 00:16:30.225 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.225 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:30.225 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.484 "name": "pt1", 00:16:30.484 "aliases": [ 00:16:30.484 "00000000-0000-0000-0000-000000000001" 00:16:30.484 ], 00:16:30.484 "product_name": "passthru", 00:16:30.484 "block_size": 512, 00:16:30.484 "num_blocks": 65536, 00:16:30.484 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:30.484 "assigned_rate_limits": { 00:16:30.484 "rw_ios_per_sec": 0, 00:16:30.484 "rw_mbytes_per_sec": 0, 00:16:30.484 "r_mbytes_per_sec": 0, 00:16:30.484 "w_mbytes_per_sec": 0 00:16:30.484 }, 00:16:30.484 "claimed": true, 00:16:30.484 "claim_type": "exclusive_write", 00:16:30.484 "zoned": false, 00:16:30.484 "supported_io_types": { 00:16:30.484 "read": true, 00:16:30.484 "write": true, 00:16:30.484 "unmap": true, 00:16:30.484 "flush": true, 00:16:30.484 "reset": true, 00:16:30.484 "nvme_admin": false, 00:16:30.484 "nvme_io": false, 00:16:30.484 "nvme_io_md": false, 00:16:30.484 "write_zeroes": true, 00:16:30.484 "zcopy": true, 00:16:30.484 "get_zone_info": false, 00:16:30.484 "zone_management": false, 00:16:30.484 "zone_append": false, 00:16:30.484 "compare": false, 00:16:30.484 "compare_and_write": false, 00:16:30.484 "abort": true, 00:16:30.484 "seek_hole": false, 00:16:30.484 "seek_data": false, 
00:16:30.484 "copy": true, 00:16:30.484 "nvme_iov_md": false 00:16:30.484 }, 00:16:30.484 "memory_domains": [ 00:16:30.484 { 00:16:30.484 "dma_device_id": "system", 00:16:30.484 "dma_device_type": 1 00:16:30.484 }, 00:16:30.484 { 00:16:30.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.484 "dma_device_type": 2 00:16:30.484 } 00:16:30.484 ], 00:16:30.484 "driver_specific": { 00:16:30.484 "passthru": { 00:16:30.484 "name": "pt1", 00:16:30.484 "base_bdev_name": "malloc1" 00:16:30.484 } 00:16:30.484 } 00:16:30.484 }' 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.484 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.742 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:30.742 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.742 13:35:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:16:30.742 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.999 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.999 "name": "pt2", 00:16:30.999 "aliases": [ 00:16:30.999 "00000000-0000-0000-0000-000000000002" 00:16:30.999 ], 00:16:30.999 "product_name": "passthru", 00:16:30.999 "block_size": 512, 00:16:30.999 "num_blocks": 65536, 00:16:30.999 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:30.999 "assigned_rate_limits": { 00:16:30.999 "rw_ios_per_sec": 0, 00:16:30.999 "rw_mbytes_per_sec": 0, 00:16:30.999 "r_mbytes_per_sec": 0, 00:16:30.999 "w_mbytes_per_sec": 0 00:16:30.999 }, 00:16:30.999 "claimed": true, 00:16:30.999 "claim_type": "exclusive_write", 00:16:30.999 "zoned": false, 00:16:31.000 "supported_io_types": { 00:16:31.000 "read": true, 00:16:31.000 "write": true, 00:16:31.000 "unmap": true, 00:16:31.000 "flush": true, 00:16:31.000 "reset": true, 00:16:31.000 "nvme_admin": false, 00:16:31.000 "nvme_io": false, 00:16:31.000 "nvme_io_md": false, 00:16:31.000 "write_zeroes": true, 00:16:31.000 "zcopy": true, 00:16:31.000 "get_zone_info": false, 00:16:31.000 "zone_management": false, 00:16:31.000 "zone_append": false, 00:16:31.000 "compare": false, 00:16:31.000 "compare_and_write": false, 00:16:31.000 "abort": true, 00:16:31.000 "seek_hole": false, 00:16:31.000 "seek_data": false, 00:16:31.000 "copy": true, 00:16:31.000 "nvme_iov_md": false 00:16:31.000 }, 00:16:31.000 "memory_domains": [ 00:16:31.000 { 00:16:31.000 "dma_device_id": "system", 00:16:31.000 "dma_device_type": 1 00:16:31.000 }, 00:16:31.000 { 00:16:31.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.000 "dma_device_type": 2 00:16:31.000 } 00:16:31.000 ], 00:16:31.000 "driver_specific": { 00:16:31.000 "passthru": { 00:16:31.000 "name": "pt2", 00:16:31.000 "base_bdev_name": "malloc2" 00:16:31.000 } 00:16:31.000 } 00:16:31.000 }' 00:16:31.000 13:35:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.000 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.258 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.515 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:31.515 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:31.515 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:31.515 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:31.773 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:31.773 "name": "pt3", 00:16:31.773 "aliases": [ 00:16:31.773 "00000000-0000-0000-0000-000000000003" 00:16:31.773 ], 00:16:31.773 "product_name": "passthru", 00:16:31.773 "block_size": 512, 00:16:31.773 "num_blocks": 65536, 00:16:31.773 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:31.773 "assigned_rate_limits": { 
00:16:31.773 "rw_ios_per_sec": 0, 00:16:31.773 "rw_mbytes_per_sec": 0, 00:16:31.773 "r_mbytes_per_sec": 0, 00:16:31.773 "w_mbytes_per_sec": 0 00:16:31.773 }, 00:16:31.773 "claimed": true, 00:16:31.773 "claim_type": "exclusive_write", 00:16:31.773 "zoned": false, 00:16:31.773 "supported_io_types": { 00:16:31.773 "read": true, 00:16:31.773 "write": true, 00:16:31.773 "unmap": true, 00:16:31.773 "flush": true, 00:16:31.773 "reset": true, 00:16:31.773 "nvme_admin": false, 00:16:31.773 "nvme_io": false, 00:16:31.773 "nvme_io_md": false, 00:16:31.773 "write_zeroes": true, 00:16:31.773 "zcopy": true, 00:16:31.773 "get_zone_info": false, 00:16:31.773 "zone_management": false, 00:16:31.773 "zone_append": false, 00:16:31.773 "compare": false, 00:16:31.773 "compare_and_write": false, 00:16:31.773 "abort": true, 00:16:31.773 "seek_hole": false, 00:16:31.773 "seek_data": false, 00:16:31.773 "copy": true, 00:16:31.773 "nvme_iov_md": false 00:16:31.773 }, 00:16:31.773 "memory_domains": [ 00:16:31.773 { 00:16:31.773 "dma_device_id": "system", 00:16:31.773 "dma_device_type": 1 00:16:31.773 }, 00:16:31.773 { 00:16:31.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.773 "dma_device_type": 2 00:16:31.773 } 00:16:31.773 ], 00:16:31.773 "driver_specific": { 00:16:31.773 "passthru": { 00:16:31.773 "name": "pt3", 00:16:31.773 "base_bdev_name": "malloc3" 00:16:31.773 } 00:16:31.773 } 00:16:31.773 }' 00:16:31.773 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.773 13:35:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:31.773 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.031 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.031 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.031 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:32.031 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:32.289 [2024-07-15 13:35:11.476486] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.289 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 00:16:32.289 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 ']' 00:16:32.289 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:32.547 [2024-07-15 13:35:11.720843] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:32.547 [2024-07-15 13:35:11.720865] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:32.547 [2024-07-15 13:35:11.720917] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:32.547 [2024-07-15 13:35:11.720994] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:16:32.547 [2024-07-15 13:35:11.721008] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad7ea0 name raid_bdev1, state offline 00:16:32.547 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.547 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:32.804 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:32.804 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:32.804 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:32.804 13:35:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:32.804 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:32.804 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:33.061 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:33.061 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:33.318 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:33.318 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:33.575 13:35:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:33.832 [2024-07-15 13:35:13.208706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:33.832 [2024-07-15 13:35:13.210072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:33.832 [2024-07-15 13:35:13.210116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:33.832 [2024-07-15 13:35:13.210162] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:33.832 [2024-07-15 13:35:13.210204] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:33.832 [2024-07-15 13:35:13.210227] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:33.832 [2024-07-15 13:35:13.210253] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:33.832 [2024-07-15 13:35:13.210263] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c82ff0 name raid_bdev1, state configuring 00:16:33.832 request: 00:16:33.832 { 00:16:33.832 "name": "raid_bdev1", 00:16:33.832 "raid_level": "raid1", 00:16:33.832 "base_bdevs": [ 00:16:33.832 "malloc1", 00:16:33.832 "malloc2", 00:16:33.832 "malloc3" 00:16:33.832 ], 00:16:33.832 "superblock": false, 00:16:33.832 "method": "bdev_raid_create", 00:16:33.832 "req_id": 1 00:16:33.832 } 00:16:33.832 Got JSON-RPC error response 00:16:33.832 response: 00:16:33.832 { 00:16:33.832 "code": -17, 00:16:33.832 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:33.832 } 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.832 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:34.114 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:34.114 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:34.114 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:34.418 [2024-07-15 13:35:13.685903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:34.418 [2024-07-15 13:35:13.685950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:34.418 [2024-07-15 13:35:13.685971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1adf7a0 00:16:34.418 [2024-07-15 13:35:13.685984] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:34.418 [2024-07-15 13:35:13.687604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:34.418 [2024-07-15 13:35:13.687633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:34.418 [2024-07-15 13:35:13.687701] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:34.418 [2024-07-15 13:35:13.687730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:34.418 pt1 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.418 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:34.676 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.676 "name": "raid_bdev1", 00:16:34.676 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:34.676 "strip_size_kb": 0, 00:16:34.676 "state": "configuring", 00:16:34.676 "raid_level": "raid1", 00:16:34.676 "superblock": true, 00:16:34.676 "num_base_bdevs": 3, 00:16:34.676 "num_base_bdevs_discovered": 1, 00:16:34.676 "num_base_bdevs_operational": 3, 00:16:34.676 "base_bdevs_list": [ 00:16:34.676 { 00:16:34.676 "name": "pt1", 00:16:34.676 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:34.676 "is_configured": true, 00:16:34.676 "data_offset": 2048, 00:16:34.676 
"data_size": 63488 00:16:34.676 }, 00:16:34.676 { 00:16:34.676 "name": null, 00:16:34.676 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:34.676 "is_configured": false, 00:16:34.676 "data_offset": 2048, 00:16:34.676 "data_size": 63488 00:16:34.676 }, 00:16:34.676 { 00:16:34.676 "name": null, 00:16:34.676 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:34.676 "is_configured": false, 00:16:34.676 "data_offset": 2048, 00:16:34.676 "data_size": 63488 00:16:34.676 } 00:16:34.676 ] 00:16:34.676 }' 00:16:34.676 13:35:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.676 13:35:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.242 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:35.242 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:35.500 [2024-07-15 13:35:14.716807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:35.500 [2024-07-15 13:35:14.716861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:35.500 [2024-07-15 13:35:14.716881] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad6a10 00:16:35.500 [2024-07-15 13:35:14.716894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:35.500 [2024-07-15 13:35:14.717267] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:35.500 [2024-07-15 13:35:14.717285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:35.500 [2024-07-15 13:35:14.717351] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:35.500 [2024-07-15 13:35:14.717370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:16:35.500 pt2 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:35.500 [2024-07-15 13:35:14.897304] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.500 13:35:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:35.759 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.759 "name": "raid_bdev1", 00:16:35.759 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:35.759 "strip_size_kb": 
0, 00:16:35.759 "state": "configuring", 00:16:35.759 "raid_level": "raid1", 00:16:35.759 "superblock": true, 00:16:35.759 "num_base_bdevs": 3, 00:16:35.759 "num_base_bdevs_discovered": 1, 00:16:35.759 "num_base_bdevs_operational": 3, 00:16:35.759 "base_bdevs_list": [ 00:16:35.759 { 00:16:35.759 "name": "pt1", 00:16:35.759 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:35.759 "is_configured": true, 00:16:35.759 "data_offset": 2048, 00:16:35.759 "data_size": 63488 00:16:35.759 }, 00:16:35.759 { 00:16:35.759 "name": null, 00:16:35.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:35.759 "is_configured": false, 00:16:35.759 "data_offset": 2048, 00:16:35.759 "data_size": 63488 00:16:35.759 }, 00:16:35.759 { 00:16:35.759 "name": null, 00:16:35.759 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:35.759 "is_configured": false, 00:16:35.759 "data_offset": 2048, 00:16:35.759 "data_size": 63488 00:16:35.759 } 00:16:35.759 ] 00:16:35.759 }' 00:16:35.759 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.759 13:35:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.325 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:36.325 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:36.326 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:36.584 [2024-07-15 13:35:15.899959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:36.584 [2024-07-15 13:35:15.900014] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:36.584 [2024-07-15 13:35:15.900035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1adfa10 00:16:36.584 
[2024-07-15 13:35:15.900048] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.584 [2024-07-15 13:35:15.900399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.584 [2024-07-15 13:35:15.900416] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:36.584 [2024-07-15 13:35:15.900485] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:36.584 [2024-07-15 13:35:15.900503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:36.584 pt2 00:16:36.584 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:36.584 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:36.584 13:35:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:36.843 [2024-07-15 13:35:16.144592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:36.843 [2024-07-15 13:35:16.144628] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:36.843 [2024-07-15 13:35:16.144645] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad66c0 00:16:36.843 [2024-07-15 13:35:16.144657] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.843 [2024-07-15 13:35:16.144949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.843 [2024-07-15 13:35:16.144967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:36.843 [2024-07-15 13:35:16.145020] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:36.843 [2024-07-15 13:35:16.145037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:16:36.843 [2024-07-15 13:35:16.145142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c79c00 00:16:36.843 [2024-07-15 13:35:16.145153] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:36.843 [2024-07-15 13:35:16.145318] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad9610 00:16:36.843 [2024-07-15 13:35:16.145446] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c79c00 00:16:36.843 [2024-07-15 13:35:16.145456] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c79c00 00:16:36.843 [2024-07-15 13:35:16.145552] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:36.843 pt3 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.843 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:37.101 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.101 "name": "raid_bdev1", 00:16:37.102 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:37.102 "strip_size_kb": 0, 00:16:37.102 "state": "online", 00:16:37.102 "raid_level": "raid1", 00:16:37.102 "superblock": true, 00:16:37.102 "num_base_bdevs": 3, 00:16:37.102 "num_base_bdevs_discovered": 3, 00:16:37.102 "num_base_bdevs_operational": 3, 00:16:37.102 "base_bdevs_list": [ 00:16:37.102 { 00:16:37.102 "name": "pt1", 00:16:37.102 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:37.102 "is_configured": true, 00:16:37.102 "data_offset": 2048, 00:16:37.102 "data_size": 63488 00:16:37.102 }, 00:16:37.102 { 00:16:37.102 "name": "pt2", 00:16:37.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:37.102 "is_configured": true, 00:16:37.102 "data_offset": 2048, 00:16:37.102 "data_size": 63488 00:16:37.102 }, 00:16:37.102 { 00:16:37.102 "name": "pt3", 00:16:37.102 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:37.102 "is_configured": true, 00:16:37.102 "data_offset": 2048, 00:16:37.102 "data_size": 63488 00:16:37.102 } 00:16:37.102 ] 00:16:37.102 }' 00:16:37.102 13:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.102 13:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:37.668 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:37.926 [2024-07-15 13:35:17.243829] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:37.926 "name": "raid_bdev1", 00:16:37.926 "aliases": [ 00:16:37.926 "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4" 00:16:37.926 ], 00:16:37.926 "product_name": "Raid Volume", 00:16:37.926 "block_size": 512, 00:16:37.926 "num_blocks": 63488, 00:16:37.926 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:37.926 "assigned_rate_limits": { 00:16:37.926 "rw_ios_per_sec": 0, 00:16:37.926 "rw_mbytes_per_sec": 0, 00:16:37.926 "r_mbytes_per_sec": 0, 00:16:37.926 "w_mbytes_per_sec": 0 00:16:37.926 }, 00:16:37.926 "claimed": false, 00:16:37.926 "zoned": false, 00:16:37.926 "supported_io_types": { 00:16:37.926 "read": true, 00:16:37.926 "write": true, 00:16:37.926 "unmap": false, 00:16:37.926 "flush": false, 00:16:37.926 "reset": true, 00:16:37.926 "nvme_admin": false, 00:16:37.926 "nvme_io": false, 00:16:37.926 "nvme_io_md": false, 00:16:37.926 "write_zeroes": true, 00:16:37.926 "zcopy": false, 00:16:37.926 "get_zone_info": false, 00:16:37.926 "zone_management": false, 00:16:37.926 "zone_append": false, 
00:16:37.926 "compare": false, 00:16:37.926 "compare_and_write": false, 00:16:37.926 "abort": false, 00:16:37.926 "seek_hole": false, 00:16:37.926 "seek_data": false, 00:16:37.926 "copy": false, 00:16:37.926 "nvme_iov_md": false 00:16:37.926 }, 00:16:37.926 "memory_domains": [ 00:16:37.926 { 00:16:37.926 "dma_device_id": "system", 00:16:37.926 "dma_device_type": 1 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.926 "dma_device_type": 2 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "dma_device_id": "system", 00:16:37.926 "dma_device_type": 1 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.926 "dma_device_type": 2 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "dma_device_id": "system", 00:16:37.926 "dma_device_type": 1 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.926 "dma_device_type": 2 00:16:37.926 } 00:16:37.926 ], 00:16:37.926 "driver_specific": { 00:16:37.926 "raid": { 00:16:37.926 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:37.926 "strip_size_kb": 0, 00:16:37.926 "state": "online", 00:16:37.926 "raid_level": "raid1", 00:16:37.926 "superblock": true, 00:16:37.926 "num_base_bdevs": 3, 00:16:37.926 "num_base_bdevs_discovered": 3, 00:16:37.926 "num_base_bdevs_operational": 3, 00:16:37.926 "base_bdevs_list": [ 00:16:37.926 { 00:16:37.926 "name": "pt1", 00:16:37.926 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:37.926 "is_configured": true, 00:16:37.926 "data_offset": 2048, 00:16:37.926 "data_size": 63488 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "name": "pt2", 00:16:37.926 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:37.926 "is_configured": true, 00:16:37.926 "data_offset": 2048, 00:16:37.926 "data_size": 63488 00:16:37.926 }, 00:16:37.926 { 00:16:37.926 "name": "pt3", 00:16:37.926 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:37.926 "is_configured": true, 00:16:37.926 "data_offset": 2048, 
00:16:37.926 "data_size": 63488 00:16:37.926 } 00:16:37.926 ] 00:16:37.926 } 00:16:37.926 } 00:16:37.926 }' 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:37.926 pt2 00:16:37.926 pt3' 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:37.926 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.185 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.185 "name": "pt1", 00:16:38.185 "aliases": [ 00:16:38.185 "00000000-0000-0000-0000-000000000001" 00:16:38.185 ], 00:16:38.185 "product_name": "passthru", 00:16:38.185 "block_size": 512, 00:16:38.185 "num_blocks": 65536, 00:16:38.185 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:38.185 "assigned_rate_limits": { 00:16:38.185 "rw_ios_per_sec": 0, 00:16:38.185 "rw_mbytes_per_sec": 0, 00:16:38.185 "r_mbytes_per_sec": 0, 00:16:38.185 "w_mbytes_per_sec": 0 00:16:38.185 }, 00:16:38.185 "claimed": true, 00:16:38.185 "claim_type": "exclusive_write", 00:16:38.185 "zoned": false, 00:16:38.185 "supported_io_types": { 00:16:38.185 "read": true, 00:16:38.185 "write": true, 00:16:38.185 "unmap": true, 00:16:38.185 "flush": true, 00:16:38.185 "reset": true, 00:16:38.185 "nvme_admin": false, 00:16:38.185 "nvme_io": false, 00:16:38.185 "nvme_io_md": false, 00:16:38.185 "write_zeroes": true, 00:16:38.185 "zcopy": true, 00:16:38.185 "get_zone_info": false, 00:16:38.185 "zone_management": false, 00:16:38.185 "zone_append": false, 00:16:38.185 "compare": false, 
00:16:38.185 "compare_and_write": false, 00:16:38.185 "abort": true, 00:16:38.185 "seek_hole": false, 00:16:38.185 "seek_data": false, 00:16:38.185 "copy": true, 00:16:38.185 "nvme_iov_md": false 00:16:38.185 }, 00:16:38.185 "memory_domains": [ 00:16:38.185 { 00:16:38.185 "dma_device_id": "system", 00:16:38.185 "dma_device_type": 1 00:16:38.185 }, 00:16:38.185 { 00:16:38.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.185 "dma_device_type": 2 00:16:38.185 } 00:16:38.185 ], 00:16:38.185 "driver_specific": { 00:16:38.185 "passthru": { 00:16:38.185 "name": "pt1", 00:16:38.185 "base_bdev_name": "malloc1" 00:16:38.185 } 00:16:38.185 } 00:16:38.185 }' 00:16:38.185 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.185 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.443 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.702 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.702 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.702 13:35:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:38.702 13:35:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.960 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.960 "name": "pt2", 00:16:38.960 "aliases": [ 00:16:38.960 "00000000-0000-0000-0000-000000000002" 00:16:38.960 ], 00:16:38.960 "product_name": "passthru", 00:16:38.960 "block_size": 512, 00:16:38.960 "num_blocks": 65536, 00:16:38.960 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:38.960 "assigned_rate_limits": { 00:16:38.960 "rw_ios_per_sec": 0, 00:16:38.960 "rw_mbytes_per_sec": 0, 00:16:38.960 "r_mbytes_per_sec": 0, 00:16:38.960 "w_mbytes_per_sec": 0 00:16:38.960 }, 00:16:38.960 "claimed": true, 00:16:38.960 "claim_type": "exclusive_write", 00:16:38.960 "zoned": false, 00:16:38.960 "supported_io_types": { 00:16:38.960 "read": true, 00:16:38.960 "write": true, 00:16:38.960 "unmap": true, 00:16:38.960 "flush": true, 00:16:38.960 "reset": true, 00:16:38.960 "nvme_admin": false, 00:16:38.960 "nvme_io": false, 00:16:38.960 "nvme_io_md": false, 00:16:38.960 "write_zeroes": true, 00:16:38.960 "zcopy": true, 00:16:38.960 "get_zone_info": false, 00:16:38.960 "zone_management": false, 00:16:38.960 "zone_append": false, 00:16:38.960 "compare": false, 00:16:38.960 "compare_and_write": false, 00:16:38.960 "abort": true, 00:16:38.960 "seek_hole": false, 00:16:38.961 "seek_data": false, 00:16:38.961 "copy": true, 00:16:38.961 "nvme_iov_md": false 00:16:38.961 }, 00:16:38.961 "memory_domains": [ 00:16:38.961 { 00:16:38.961 "dma_device_id": "system", 00:16:38.961 "dma_device_type": 1 00:16:38.961 }, 00:16:38.961 { 00:16:38.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.961 "dma_device_type": 2 00:16:38.961 } 00:16:38.961 ], 00:16:38.961 "driver_specific": { 00:16:38.961 "passthru": { 00:16:38.961 
"name": "pt2", 00:16:38.961 "base_bdev_name": "malloc2" 00:16:38.961 } 00:16:38.961 } 00:16:38.961 }' 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.961 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.218 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.476 "name": "pt3", 00:16:39.476 "aliases": [ 00:16:39.476 "00000000-0000-0000-0000-000000000003" 00:16:39.476 ], 00:16:39.476 "product_name": "passthru", 00:16:39.476 "block_size": 512, 00:16:39.476 
"num_blocks": 65536, 00:16:39.476 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:39.476 "assigned_rate_limits": { 00:16:39.476 "rw_ios_per_sec": 0, 00:16:39.476 "rw_mbytes_per_sec": 0, 00:16:39.476 "r_mbytes_per_sec": 0, 00:16:39.476 "w_mbytes_per_sec": 0 00:16:39.476 }, 00:16:39.476 "claimed": true, 00:16:39.476 "claim_type": "exclusive_write", 00:16:39.476 "zoned": false, 00:16:39.476 "supported_io_types": { 00:16:39.476 "read": true, 00:16:39.476 "write": true, 00:16:39.476 "unmap": true, 00:16:39.476 "flush": true, 00:16:39.476 "reset": true, 00:16:39.476 "nvme_admin": false, 00:16:39.476 "nvme_io": false, 00:16:39.476 "nvme_io_md": false, 00:16:39.476 "write_zeroes": true, 00:16:39.476 "zcopy": true, 00:16:39.476 "get_zone_info": false, 00:16:39.476 "zone_management": false, 00:16:39.476 "zone_append": false, 00:16:39.476 "compare": false, 00:16:39.476 "compare_and_write": false, 00:16:39.476 "abort": true, 00:16:39.476 "seek_hole": false, 00:16:39.476 "seek_data": false, 00:16:39.476 "copy": true, 00:16:39.476 "nvme_iov_md": false 00:16:39.476 }, 00:16:39.476 "memory_domains": [ 00:16:39.476 { 00:16:39.476 "dma_device_id": "system", 00:16:39.476 "dma_device_type": 1 00:16:39.476 }, 00:16:39.476 { 00:16:39.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.476 "dma_device_type": 2 00:16:39.476 } 00:16:39.476 ], 00:16:39.476 "driver_specific": { 00:16:39.476 "passthru": { 00:16:39.476 "name": "pt3", 00:16:39.476 "base_bdev_name": "malloc3" 00:16:39.476 } 00:16:39.476 } 00:16:39.476 }' 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.476 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:16:39.733 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.733 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.733 13:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.733 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.733 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.733 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.733 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.734 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:39.734 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:39.991 [2024-07-15 13:35:19.317332] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:39.991 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 '!=' fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 ']' 00:16:39.991 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:16:39.991 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:39.991 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:39.991 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:40.248 [2024-07-15 13:35:19.565745] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.249 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:40.506 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.506 "name": "raid_bdev1", 00:16:40.506 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:40.506 "strip_size_kb": 0, 00:16:40.506 "state": "online", 00:16:40.506 "raid_level": "raid1", 00:16:40.506 "superblock": true, 00:16:40.506 "num_base_bdevs": 3, 00:16:40.506 "num_base_bdevs_discovered": 2, 00:16:40.506 "num_base_bdevs_operational": 2, 00:16:40.506 "base_bdevs_list": [ 00:16:40.506 { 00:16:40.506 "name": null, 00:16:40.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.506 "is_configured": false, 00:16:40.506 
"data_offset": 2048, 00:16:40.506 "data_size": 63488 00:16:40.506 }, 00:16:40.506 { 00:16:40.506 "name": "pt2", 00:16:40.506 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:40.506 "is_configured": true, 00:16:40.507 "data_offset": 2048, 00:16:40.507 "data_size": 63488 00:16:40.507 }, 00:16:40.507 { 00:16:40.507 "name": "pt3", 00:16:40.507 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:40.507 "is_configured": true, 00:16:40.507 "data_offset": 2048, 00:16:40.507 "data_size": 63488 00:16:40.507 } 00:16:40.507 ] 00:16:40.507 }' 00:16:40.507 13:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.507 13:35:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.072 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:41.329 [2024-07-15 13:35:20.652584] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:41.329 [2024-07-15 13:35:20.652615] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:41.329 [2024-07-15 13:35:20.652675] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:41.329 [2024-07-15 13:35:20.652730] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:41.329 [2024-07-15 13:35:20.652747] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c79c00 name raid_bdev1, state offline 00:16:41.329 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.329 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:16:41.586 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 
00:16:41.586 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:16:41.586 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:16:41.586 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:41.586 13:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:41.843 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:42.101 [2024-07-15 13:35:21.410747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:42.101 [2024-07-15 13:35:21.410795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:42.101 [2024-07-15 13:35:21.410812] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad7310 00:16:42.101 [2024-07-15 13:35:21.410825] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:42.101 [2024-07-15 13:35:21.412430] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:42.101 [2024-07-15 13:35:21.412459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:42.101 [2024-07-15 13:35:21.412525] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:42.101 [2024-07-15 13:35:21.412552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:42.101 pt2 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.101 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:16:42.358 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.358 "name": "raid_bdev1", 00:16:42.358 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:42.358 "strip_size_kb": 0, 00:16:42.358 "state": "configuring", 00:16:42.358 "raid_level": "raid1", 00:16:42.358 "superblock": true, 00:16:42.358 "num_base_bdevs": 3, 00:16:42.358 "num_base_bdevs_discovered": 1, 00:16:42.358 "num_base_bdevs_operational": 2, 00:16:42.358 "base_bdevs_list": [ 00:16:42.358 { 00:16:42.358 "name": null, 00:16:42.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.358 "is_configured": false, 00:16:42.358 "data_offset": 2048, 00:16:42.358 "data_size": 63488 00:16:42.358 }, 00:16:42.358 { 00:16:42.358 "name": "pt2", 00:16:42.358 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:42.358 "is_configured": true, 00:16:42.358 "data_offset": 2048, 00:16:42.358 "data_size": 63488 00:16:42.358 }, 00:16:42.358 { 00:16:42.358 "name": null, 00:16:42.358 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:42.358 "is_configured": false, 00:16:42.358 "data_offset": 2048, 00:16:42.358 "data_size": 63488 00:16:42.358 } 00:16:42.358 ] 00:16:42.358 }' 00:16:42.358 13:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.358 13:35:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.922 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:16:42.922 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:42.922 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:16:42.922 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:43.179 [2024-07-15 13:35:22.433472] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:43.179 [2024-07-15 13:35:22.433529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:43.179 [2024-07-15 13:35:22.433552] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad5ec0 00:16:43.179 [2024-07-15 13:35:22.433573] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:43.179 [2024-07-15 13:35:22.433949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:43.179 [2024-07-15 13:35:22.433969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:43.179 [2024-07-15 13:35:22.434035] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:43.179 [2024-07-15 13:35:22.434056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:43.179 [2024-07-15 13:35:22.434162] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c77cc0 00:16:43.179 [2024-07-15 13:35:22.434173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:43.179 [2024-07-15 13:35:22.434342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c786d0 00:16:43.179 [2024-07-15 13:35:22.434469] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c77cc0 00:16:43.179 [2024-07-15 13:35:22.434479] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c77cc0 00:16:43.179 [2024-07-15 13:35:22.434579] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:43.179 pt3 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.179 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:43.436 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.436 "name": "raid_bdev1", 00:16:43.436 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:43.436 "strip_size_kb": 0, 00:16:43.436 "state": "online", 00:16:43.436 "raid_level": "raid1", 00:16:43.436 "superblock": true, 00:16:43.436 "num_base_bdevs": 3, 00:16:43.436 "num_base_bdevs_discovered": 2, 00:16:43.436 "num_base_bdevs_operational": 2, 00:16:43.436 "base_bdevs_list": [ 00:16:43.436 { 00:16:43.436 "name": null, 00:16:43.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.436 "is_configured": false, 00:16:43.436 "data_offset": 2048, 00:16:43.436 "data_size": 63488 00:16:43.436 }, 00:16:43.436 { 00:16:43.436 "name": "pt2", 00:16:43.436 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:43.436 "is_configured": true, 00:16:43.436 
"data_offset": 2048, 00:16:43.436 "data_size": 63488 00:16:43.436 }, 00:16:43.436 { 00:16:43.436 "name": "pt3", 00:16:43.436 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:43.436 "is_configured": true, 00:16:43.436 "data_offset": 2048, 00:16:43.436 "data_size": 63488 00:16:43.436 } 00:16:43.436 ] 00:16:43.436 }' 00:16:43.436 13:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.436 13:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.000 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:44.257 [2024-07-15 13:35:23.508303] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:44.257 [2024-07-15 13:35:23.508333] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:44.257 [2024-07-15 13:35:23.508392] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:44.257 [2024-07-15 13:35:23.508445] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:44.257 [2024-07-15 13:35:23.508456] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c77cc0 name raid_bdev1, state offline 00:16:44.257 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.257 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:16:44.515 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:16:44.515 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:16:44.515 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:16:44.515 13:35:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:16:44.515 13:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:44.773 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:45.031 [2024-07-15 13:35:24.242203] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:45.031 [2024-07-15 13:35:24.242249] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:45.031 [2024-07-15 13:35:24.242266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad5ec0 00:16:45.031 [2024-07-15 13:35:24.242279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:45.031 [2024-07-15 13:35:24.243881] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:45.031 [2024-07-15 13:35:24.243909] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:45.031 [2024-07-15 13:35:24.243984] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:45.031 [2024-07-15 13:35:24.244010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:45.031 [2024-07-15 13:35:24.244108] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:16:45.031 [2024-07-15 13:35:24.244131] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:45.031 [2024-07-15 13:35:24.244146] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c77f40 name raid_bdev1, state configuring 00:16:45.031 [2024-07-15 13:35:24.244170] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:45.031 pt1 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.031 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:45.289 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.289 "name": "raid_bdev1", 00:16:45.289 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:45.289 "strip_size_kb": 0, 00:16:45.289 "state": "configuring", 00:16:45.289 "raid_level": "raid1", 00:16:45.289 "superblock": true, 00:16:45.289 "num_base_bdevs": 3, 
00:16:45.289 "num_base_bdevs_discovered": 1, 00:16:45.289 "num_base_bdevs_operational": 2, 00:16:45.289 "base_bdevs_list": [ 00:16:45.289 { 00:16:45.289 "name": null, 00:16:45.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.289 "is_configured": false, 00:16:45.289 "data_offset": 2048, 00:16:45.289 "data_size": 63488 00:16:45.289 }, 00:16:45.289 { 00:16:45.289 "name": "pt2", 00:16:45.289 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:45.289 "is_configured": true, 00:16:45.289 "data_offset": 2048, 00:16:45.289 "data_size": 63488 00:16:45.289 }, 00:16:45.289 { 00:16:45.289 "name": null, 00:16:45.289 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:45.289 "is_configured": false, 00:16:45.289 "data_offset": 2048, 00:16:45.289 "data_size": 63488 00:16:45.289 } 00:16:45.289 ] 00:16:45.289 }' 00:16:45.289 13:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.289 13:35:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.855 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:45.855 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:16:46.113 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:16:46.113 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:46.372 [2024-07-15 13:35:25.577759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:46.372 [2024-07-15 13:35:25.577816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:46.372 [2024-07-15 13:35:25.577836] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad90c0 00:16:46.372 [2024-07-15 13:35:25.577849] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:46.372 [2024-07-15 13:35:25.578231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:46.372 [2024-07-15 13:35:25.578254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:46.372 [2024-07-15 13:35:25.578328] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:46.372 [2024-07-15 13:35:25.578348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:46.372 [2024-07-15 13:35:25.578457] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad9a40 00:16:46.372 [2024-07-15 13:35:25.578468] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:46.372 [2024-07-15 13:35:25.578643] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c786c0 00:16:46.372 [2024-07-15 13:35:25.578775] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad9a40 00:16:46.372 [2024-07-15 13:35:25.578785] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad9a40 00:16:46.372 [2024-07-15 13:35:25.578885] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:46.372 pt3 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.372 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:46.630 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.630 "name": "raid_bdev1", 00:16:46.630 "uuid": "fb8eeab9-bc4f-4ba6-8051-cf34db0184b4", 00:16:46.630 "strip_size_kb": 0, 00:16:46.630 "state": "online", 00:16:46.630 "raid_level": "raid1", 00:16:46.630 "superblock": true, 00:16:46.630 "num_base_bdevs": 3, 00:16:46.630 "num_base_bdevs_discovered": 2, 00:16:46.630 "num_base_bdevs_operational": 2, 00:16:46.630 "base_bdevs_list": [ 00:16:46.630 { 00:16:46.630 "name": null, 00:16:46.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.630 "is_configured": false, 00:16:46.630 "data_offset": 2048, 00:16:46.630 "data_size": 63488 00:16:46.630 }, 00:16:46.630 { 00:16:46.630 "name": "pt2", 00:16:46.630 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:46.630 "is_configured": true, 00:16:46.630 "data_offset": 2048, 00:16:46.630 "data_size": 63488 00:16:46.630 }, 00:16:46.630 { 00:16:46.630 "name": "pt3", 00:16:46.630 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:46.630 "is_configured": true, 00:16:46.630 
"data_offset": 2048, 00:16:46.630 "data_size": 63488 00:16:46.630 } 00:16:46.630 ] 00:16:46.630 }' 00:16:46.630 13:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.630 13:35:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.195 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:16:47.195 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:47.452 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:16:47.452 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:47.452 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:16:47.710 [2024-07-15 13:35:26.905556] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 '!=' fb8eeab9-bc4f-4ba6-8051-cf34db0184b4 ']' 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2122479 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2122479 ']' 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2122479 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2122479 00:16:47.710 
13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2122479' 00:16:47.710 killing process with pid 2122479 00:16:47.710 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2122479 00:16:47.710 [2024-07-15 13:35:26.975518] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:47.710 [2024-07-15 13:35:26.975576] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:47.710 [2024-07-15 13:35:26.975634] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:47.711 [2024-07-15 13:35:26.975646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad9a40 name raid_bdev1, state offline 00:16:47.711 13:35:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2122479 00:16:47.711 [2024-07-15 13:35:27.006509] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:47.969 13:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:47.969 00:16:47.969 real 0m21.531s 00:16:47.969 user 0m39.327s 00:16:47.969 sys 0m3.931s 00:16:47.969 13:35:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:47.969 13:35:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.969 ************************************ 00:16:47.969 END TEST raid_superblock_test 00:16:47.969 ************************************ 00:16:47.969 13:35:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:47.969 13:35:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:16:47.969 13:35:27 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:16:47.969 13:35:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:16:47.969 13:35:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:16:47.969 ************************************
00:16:47.969 START TEST raid_read_error_test
00:16:47.969 ************************************
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:47.969 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']'
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.UtbfsrfUA7
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2125693
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2125693 /var/tmp/spdk-raid.sock
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2125693 ']'
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:16:47.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:16:47.970 13:35:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:47.970 [2024-07-15 13:35:27.381603] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:16:47.970 [2024-07-15 13:35:27.381671] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2125693 ]
00:16:48.267 [2024-07-15 13:35:27.513204] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:16:48.267 [2024-07-15 13:35:27.615991] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:16:48.267 [2024-07-15 13:35:27.675159] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:48.267 [2024-07-15 13:35:27.675191] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:49.225 13:35:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:16:49.225 13:35:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0
00:16:49.225 13:35:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:49.225 13:35:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:16:49.225 BaseBdev1_malloc
00:16:49.225 13:35:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:16:49.484 true
00:16:49.484 13:35:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:16:49.741 [2024-07-15 13:35:28.986177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:16:49.741 [2024-07-15 13:35:28.986228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:49.741 [2024-07-15 13:35:28.986250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8f0d0
00:16:49.741 [2024-07-15 13:35:28.986263] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:49.741 [2024-07-15 13:35:28.988165] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:49.741 [2024-07-15 13:35:28.988196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:16:49.741 BaseBdev1
00:16:49.741 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:49.741 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:16:49.999 BaseBdev2_malloc
00:16:49.999 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:16:50.258 true
00:16:50.258 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:16:50.516 [2024-07-15 13:35:29.716724] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:16:50.516 [2024-07-15 13:35:29.716767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:50.516 [2024-07-15 13:35:29.716789] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e93910
00:16:50.516 [2024-07-15 13:35:29.716802] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:50.516 [2024-07-15 13:35:29.718414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:50.516 [2024-07-15 13:35:29.718442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:16:50.516 BaseBdev2
00:16:50.516 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:50.516 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:16:50.774 BaseBdev3_malloc
00:16:50.774 13:35:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:16:51.032 true
00:16:51.032 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:16:51.032 [2024-07-15 13:35:30.456545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:16:51.032 [2024-07-15 13:35:30.456592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:51.032 [2024-07-15 13:35:30.456621] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e95bd0
00:16:51.032 [2024-07-15 13:35:30.456640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:51.289 [2024-07-15 13:35:30.458299] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:51.289 [2024-07-15 13:35:30.458329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:16:51.289 BaseBdev3
00:16:51.289 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
00:16:51.289 [2024-07-15 13:35:30.705211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:51.289 [2024-07-15 13:35:30.706433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:51.289 [2024-07-15 13:35:30.706503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:51.289 [2024-07-15 13:35:30.706718] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e97280
00:16:51.289 [2024-07-15 13:35:30.706735] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:16:51.289 [2024-07-15 13:35:30.706919] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e96e20
00:16:51.289 [2024-07-15 13:35:30.707078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e97280
00:16:51.289 [2024-07-15 13:35:30.707089] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e97280
00:16:51.289 [2024-07-15 13:35:30.707196] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:51.547 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:51.805 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:51.805 "name": "raid_bdev1",
00:16:51.805 "uuid": "0e1e24af-eb3b-4554-a1ac-460c6764dccd",
00:16:51.805 "strip_size_kb": 0,
00:16:51.805 "state": "online",
00:16:51.805 "raid_level": "raid1",
00:16:51.805 "superblock": true,
00:16:51.805 "num_base_bdevs": 3,
00:16:51.805 "num_base_bdevs_discovered": 3,
00:16:51.805 "num_base_bdevs_operational": 3,
00:16:51.805 "base_bdevs_list": [
00:16:51.805 {
00:16:51.805 "name": "BaseBdev1",
00:16:51.805 "uuid": "b12f0987-da97-5107-bd32-4474828001af",
00:16:51.805 "is_configured": true,
00:16:51.805 "data_offset": 2048,
00:16:51.805 "data_size": 63488
00:16:51.805 },
00:16:51.805 {
00:16:51.805 "name": "BaseBdev2",
00:16:51.805 "uuid": "f074a8a4-5afa-5db8-b765-20a2389557e3",
00:16:51.805 "is_configured": true,
00:16:51.805 "data_offset": 2048,
00:16:51.805 "data_size": 63488
00:16:51.805 },
00:16:51.805 {
00:16:51.805 "name": "BaseBdev3",
00:16:51.805 "uuid": "373bea65-4b32-5e6c-b8c7-87d1b39056a2",
00:16:51.805 "is_configured": true,
00:16:51.805 "data_offset": 2048,
00:16:51.805 "data_size": 63488
00:16:51.805 }
00:16:51.805 ]
00:16:51.805 }'
00:16:51.805 13:35:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:51.805 13:35:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:52.369 13:35:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:16:52.369 13:35:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:16:52.369 [2024-07-15 13:35:31.672078] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ce4e00
00:16:53.310 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]]
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]]
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:53.567 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:53.568 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:53.568 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:53.825 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:53.825 "name": "raid_bdev1",
00:16:53.825 "uuid": "0e1e24af-eb3b-4554-a1ac-460c6764dccd",
00:16:53.825 "strip_size_kb": 0,
00:16:53.825 "state": "online",
00:16:53.825 "raid_level": "raid1",
00:16:53.825 "superblock": true,
00:16:53.825 "num_base_bdevs": 3,
00:16:53.825 "num_base_bdevs_discovered": 3,
00:16:53.825 "num_base_bdevs_operational": 3,
00:16:53.825 "base_bdevs_list": [
00:16:53.825 {
00:16:53.825 "name": "BaseBdev1",
00:16:53.825 "uuid": "b12f0987-da97-5107-bd32-4474828001af",
00:16:53.825 "is_configured": true,
00:16:53.825 "data_offset": 2048,
00:16:53.825 "data_size": 63488
00:16:53.825 },
00:16:53.825 {
00:16:53.825 "name": "BaseBdev2",
00:16:53.825 "uuid": "f074a8a4-5afa-5db8-b765-20a2389557e3",
00:16:53.825 "is_configured": true,
00:16:53.825 "data_offset": 2048,
00:16:53.825 "data_size": 63488
00:16:53.825 },
00:16:53.825 {
00:16:53.825 "name": "BaseBdev3",
00:16:53.825 "uuid": "373bea65-4b32-5e6c-b8c7-87d1b39056a2",
00:16:53.825 "is_configured": true,
00:16:53.825 "data_offset": 2048,
00:16:53.825 "data_size": 63488
00:16:53.825 }
00:16:53.825 ]
00:16:53.825 }'
00:16:53.826 13:35:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:53.826 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:54.390 13:35:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:16:54.647 [2024-07-15 13:35:33.842134] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:16:54.647 [2024-07-15 13:35:33.842169] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:16:54.647 [2024-07-15 13:35:33.845383] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:16:54.647 [2024-07-15 13:35:33.845417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:54.647 [2024-07-15 13:35:33.845514] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:16:54.647 [2024-07-15 13:35:33.845526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e97280 name raid_bdev1, state offline
00:16:54.647 0
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2125693
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2125693 ']'
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2125693
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:16:54.647 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2125693
00:16:54.648 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:16:54.648 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:16:54.648 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2125693'
killing process with pid 2125693
00:16:54.648 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2125693
00:16:54.648 [2024-07-15 13:35:33.912426] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:16:54.648 13:35:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2125693
00:16:54.648 [2024-07-15 13:35:33.933841] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.UtbfsrfUA7
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]]
00:16:54.906
00:16:54.906 real 0m6.871s
00:16:54.906 user 0m10.836s
00:16:54.906 sys 0m1.221s
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:16:54.906 13:35:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:54.906 ************************************
00:16:54.906 END TEST raid_read_error_test
00:16:54.906 ************************************
00:16:54.906 13:35:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:16:54.906 13:35:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write
00:16:54.906 13:35:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:16:54.906 13:35:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:16:54.906 13:35:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:16:54.906 ************************************
00:16:54.906 START TEST raid_write_error_test
00:16:54.906 ************************************
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']'
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.iiOGjnG7Pt
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2126761
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2126761 /var/tmp/spdk-raid.sock
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2126761 ']'
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:16:54.906 13:35:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:55.163 [2024-07-15 13:35:34.332751] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:16:55.163 [2024-07-15 13:35:34.332817] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2126761 ]
00:16:55.163 [2024-07-15 13:35:34.463040] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:16:55.163 [2024-07-15 13:35:34.569251] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:16:55.421 [2024-07-15 13:35:34.640837] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:55.421 [2024-07-15 13:35:34.640875] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:55.987 13:35:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:16:55.987 13:35:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:16:55.987 13:35:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:55.987 13:35:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:16:56.244 BaseBdev1_malloc
00:16:56.244 13:35:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:16:56.502 true
00:16:56.502 13:35:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:16:56.759 [2024-07-15 13:35:35.995647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:16:56.759 [2024-07-15 13:35:35.995693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:56.759 [2024-07-15 13:35:35.995715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc340d0
00:16:56.759 [2024-07-15 13:35:35.995728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:56.759 [2024-07-15 13:35:35.997642] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:56.759 [2024-07-15 13:35:35.997673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:16:56.759 BaseBdev1
00:16:56.759 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:56.759 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:16:57.017 BaseBdev2_malloc
00:16:57.017 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:16:57.274 true
00:16:57.274 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:16:57.531 [2024-07-15 13:35:36.751506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:16:57.531 [2024-07-15 13:35:36.751550] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:57.531 [2024-07-15 13:35:36.751575] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc38910
00:16:57.531 [2024-07-15 13:35:36.751588] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:57.531 [2024-07-15 13:35:36.753182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:57.531 [2024-07-15 13:35:36.753210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:16:57.531 BaseBdev2
00:16:57.531 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:16:57.531 13:35:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:16:57.788 BaseBdev3_malloc
00:16:57.788 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:16:58.046 true
00:16:58.046 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:16:58.315 [2024-07-15 13:35:37.498445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:16:58.315 [2024-07-15 13:35:37.498491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:58.315 [2024-07-15 13:35:37.498513] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc3abd0
00:16:58.315 [2024-07-15 13:35:37.498526] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:58.315 [2024-07-15 13:35:37.500115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:58.315 [2024-07-15 13:35:37.500143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:16:58.315 BaseBdev3
00:16:58.315 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
00:16:58.315 [2024-07-15 13:35:37.731082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:58.315 [2024-07-15 13:35:37.732418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:58.315 [2024-07-15 13:35:37.732487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:58.315 [2024-07-15 13:35:37.732702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc3c280
00:16:58.315 [2024-07-15 13:35:37.732714] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:16:58.315 [2024-07-15 13:35:37.732912] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc3be20
00:16:58.315 [2024-07-15 13:35:37.733076] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc3c280
00:16:58.315 [2024-07-15 13:35:37.733086] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc3c280
00:16:58.315 [2024-07-15 13:35:37.733193] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:58.573 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:58.573 "name": "raid_bdev1",
00:16:58.573 "uuid": "126cd2fd-0fec-4524-99e4-8688241344df",
00:16:58.573 "strip_size_kb": 0,
00:16:58.573 "state": "online",
00:16:58.573 "raid_level": "raid1",
00:16:58.573 "superblock": true,
00:16:58.573 "num_base_bdevs": 3,
00:16:58.573 "num_base_bdevs_discovered": 3,
00:16:58.573 "num_base_bdevs_operational": 3,
00:16:58.573 "base_bdevs_list": [
00:16:58.573 {
00:16:58.573 "name": "BaseBdev1",
00:16:58.573 "uuid": "4864f062-63be-546a-98f1-646cc1969325",
00:16:58.573 "is_configured": true,
00:16:58.573 "data_offset": 2048,
00:16:58.573 "data_size": 63488
00:16:58.573 },
00:16:58.573 {
00:16:58.573 "name": "BaseBdev2",
00:16:58.573 "uuid": "f6454b50-800b-52af-9dbc-fb599d8316d3",
00:16:58.573 "is_configured": true,
00:16:58.573 "data_offset": 2048,
00:16:58.574 "data_size": 63488
00:16:58.574 },
00:16:58.574 {
00:16:58.574 "name": "BaseBdev3",
00:16:58.574 "uuid": "cb480e98-4ab0-5224-969a-904196182048",
00:16:58.574 "is_configured": true,
00:16:58.574 "data_offset": 2048,
00:16:58.574 "data_size": 63488
00:16:58.574 }
00:16:58.574 ]
00:16:58.574 }'
00:16:58.574 13:35:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:58.574 13:35:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:59.137 13:35:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:16:59.137 13:35:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:16:59.395 [2024-07-15 13:35:38.661843] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa89e00
00:17:00.329 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:17:00.585 [2024-07-15 13:35:39.789373] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1'
00:17:00.585 [2024-07-15 13:35:39.789428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:17:00.585 [2024-07-15 13:35:39.789624] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa89e00
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]]
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]]
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:00.585 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:00.586 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:00.586 13:35:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:00.863 13:35:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:00.863 "name": "raid_bdev1",
00:17:00.863 "uuid": "126cd2fd-0fec-4524-99e4-8688241344df",
00:17:00.863 "strip_size_kb": 0,
00:17:00.863 "state": "online",
00:17:00.863 "raid_level": "raid1",
00:17:00.863 "superblock": true,
00:17:00.863 "num_base_bdevs": 3,
00:17:00.863 "num_base_bdevs_discovered": 2,
00:17:00.863 "num_base_bdevs_operational": 2,
00:17:00.863 "base_bdevs_list": [
00:17:00.863 {
00:17:00.863 "name": null,
00:17:00.863 "uuid": "00000000-0000-0000-0000-000000000000",
00:17:00.863 "is_configured": false,
00:17:00.863 "data_offset": 2048,
00:17:00.863 "data_size": 63488
00:17:00.863 },
00:17:00.863 {
00:17:00.863 "name": "BaseBdev2",
00:17:00.863 "uuid": "f6454b50-800b-52af-9dbc-fb599d8316d3",
00:17:00.863 "is_configured": true,
00:17:00.863 "data_offset": 2048,
00:17:00.863 "data_size": 63488
00:17:00.863 },
00:17:00.863 {
00:17:00.863 "name": "BaseBdev3",
00:17:00.863 "uuid": "cb480e98-4ab0-5224-969a-904196182048",
00:17:00.863 "is_configured": true,
00:17:00.863 "data_offset": 2048,
00:17:00.863 "data_size": 63488 00:17:00.863 } 00:17:00.863 ] 00:17:00.863 }' 00:17:00.863 13:35:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.863 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.426 13:35:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:01.684 [2024-07-15 13:35:40.880609] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:01.684 [2024-07-15 13:35:40.880653] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.684 [2024-07-15 13:35:40.883878] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.684 [2024-07-15 13:35:40.883908] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:01.684 [2024-07-15 13:35:40.883991] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:01.684 [2024-07-15 13:35:40.884003] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc3c280 name raid_bdev1, state offline 00:17:01.684 0 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2126761 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2126761 ']' 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2126761 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2126761 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2126761' 00:17:01.684 killing process with pid 2126761 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2126761 00:17:01.684 [2024-07-15 13:35:40.948654] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:01.684 13:35:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2126761 00:17:01.684 [2024-07-15 13:35:40.971040] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.iiOGjnG7Pt 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:01.942 00:17:01.942 real 0m6.950s 00:17:01.942 user 0m11.009s 00:17:01.942 sys 0m1.219s 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:01.942 13:35:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.942 ************************************ 00:17:01.942 END TEST raid_write_error_test 
00:17:01.942 ************************************ 00:17:01.942 13:35:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:01.942 13:35:41 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:01.942 13:35:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:01.942 13:35:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:01.942 13:35:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:01.942 13:35:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:01.942 13:35:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:01.942 ************************************ 00:17:01.942 START TEST raid_state_function_test 00:17:01.942 ************************************ 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:01.942 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # 
'[' false = true ']' 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2127802 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2127802' 00:17:01.943 Process raid pid: 2127802 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2127802 /var/tmp/spdk-raid.sock 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2127802 ']' 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:01.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:01.943 13:35:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.943 [2024-07-15 13:35:41.340519] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:17:01.943 [2024-07-15 13:35:41.340583] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:02.208 [2024-07-15 13:35:41.473331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:02.208 [2024-07-15 13:35:41.579039] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:02.498 [2024-07-15 13:35:41.646870] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:02.498 [2024-07-15 13:35:41.646904] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:03.094 [2024-07-15 13:35:42.489871] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:03.094 [2024-07-15 13:35:42.489914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:03.094 [2024-07-15 13:35:42.489933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:03.094 [2024-07-15 13:35:42.489945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:03.094 [2024-07-15 13:35:42.489954] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:03.094 [2024-07-15 13:35:42.489967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:03.094 [2024-07-15 13:35:42.489976] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:03.094 [2024-07-15 13:35:42.489987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.094 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.351 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.351 "name": "Existed_Raid", 00:17:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.351 "strip_size_kb": 64, 
00:17:03.351 "state": "configuring", 00:17:03.351 "raid_level": "raid0", 00:17:03.351 "superblock": false, 00:17:03.351 "num_base_bdevs": 4, 00:17:03.351 "num_base_bdevs_discovered": 0, 00:17:03.351 "num_base_bdevs_operational": 4, 00:17:03.351 "base_bdevs_list": [ 00:17:03.351 { 00:17:03.351 "name": "BaseBdev1", 00:17:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.351 "is_configured": false, 00:17:03.351 "data_offset": 0, 00:17:03.351 "data_size": 0 00:17:03.351 }, 00:17:03.351 { 00:17:03.351 "name": "BaseBdev2", 00:17:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.351 "is_configured": false, 00:17:03.351 "data_offset": 0, 00:17:03.351 "data_size": 0 00:17:03.351 }, 00:17:03.351 { 00:17:03.351 "name": "BaseBdev3", 00:17:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.351 "is_configured": false, 00:17:03.351 "data_offset": 0, 00:17:03.351 "data_size": 0 00:17:03.351 }, 00:17:03.351 { 00:17:03.351 "name": "BaseBdev4", 00:17:03.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.351 "is_configured": false, 00:17:03.351 "data_offset": 0, 00:17:03.351 "data_size": 0 00:17:03.351 } 00:17:03.351 ] 00:17:03.351 }' 00:17:03.351 13:35:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.351 13:35:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.917 13:35:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:04.175 [2024-07-15 13:35:43.468343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:04.175 [2024-07-15 13:35:43.468370] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b34aa0 name Existed_Raid, state configuring 00:17:04.175 13:35:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:04.433 [2024-07-15 13:35:43.713013] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:04.433 [2024-07-15 13:35:43.713043] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:04.433 [2024-07-15 13:35:43.713053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:04.433 [2024-07-15 13:35:43.713065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:04.433 [2024-07-15 13:35:43.713073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:04.433 [2024-07-15 13:35:43.713084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:04.433 [2024-07-15 13:35:43.713093] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:04.433 [2024-07-15 13:35:43.713104] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:04.433 13:35:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:04.691 [2024-07-15 13:35:43.975620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:04.691 BaseBdev1 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:04.691 13:35:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:04.950 13:35:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:05.208 [ 00:17:05.208 { 00:17:05.208 "name": "BaseBdev1", 00:17:05.208 "aliases": [ 00:17:05.208 "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c" 00:17:05.208 ], 00:17:05.208 "product_name": "Malloc disk", 00:17:05.208 "block_size": 512, 00:17:05.208 "num_blocks": 65536, 00:17:05.208 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:05.208 "assigned_rate_limits": { 00:17:05.208 "rw_ios_per_sec": 0, 00:17:05.208 "rw_mbytes_per_sec": 0, 00:17:05.208 "r_mbytes_per_sec": 0, 00:17:05.208 "w_mbytes_per_sec": 0 00:17:05.208 }, 00:17:05.208 "claimed": true, 00:17:05.208 "claim_type": "exclusive_write", 00:17:05.208 "zoned": false, 00:17:05.208 "supported_io_types": { 00:17:05.208 "read": true, 00:17:05.208 "write": true, 00:17:05.208 "unmap": true, 00:17:05.208 "flush": true, 00:17:05.208 "reset": true, 00:17:05.208 "nvme_admin": false, 00:17:05.208 "nvme_io": false, 00:17:05.208 "nvme_io_md": false, 00:17:05.208 "write_zeroes": true, 00:17:05.208 "zcopy": true, 00:17:05.208 "get_zone_info": false, 00:17:05.208 "zone_management": false, 00:17:05.208 "zone_append": false, 00:17:05.208 "compare": false, 00:17:05.208 "compare_and_write": false, 00:17:05.208 "abort": true, 00:17:05.208 "seek_hole": false, 00:17:05.208 "seek_data": false, 00:17:05.208 "copy": true, 00:17:05.208 "nvme_iov_md": false 
00:17:05.209 }, 00:17:05.209 "memory_domains": [ 00:17:05.209 { 00:17:05.209 "dma_device_id": "system", 00:17:05.209 "dma_device_type": 1 00:17:05.209 }, 00:17:05.209 { 00:17:05.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.209 "dma_device_type": 2 00:17:05.209 } 00:17:05.209 ], 00:17:05.209 "driver_specific": {} 00:17:05.209 } 00:17:05.209 ] 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.209 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.498 13:35:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.498 "name": "Existed_Raid", 00:17:05.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.498 "strip_size_kb": 64, 00:17:05.498 "state": "configuring", 00:17:05.498 "raid_level": "raid0", 00:17:05.498 "superblock": false, 00:17:05.498 "num_base_bdevs": 4, 00:17:05.498 "num_base_bdevs_discovered": 1, 00:17:05.498 "num_base_bdevs_operational": 4, 00:17:05.498 "base_bdevs_list": [ 00:17:05.498 { 00:17:05.498 "name": "BaseBdev1", 00:17:05.498 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:05.499 "is_configured": true, 00:17:05.499 "data_offset": 0, 00:17:05.499 "data_size": 65536 00:17:05.499 }, 00:17:05.499 { 00:17:05.499 "name": "BaseBdev2", 00:17:05.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.499 "is_configured": false, 00:17:05.499 "data_offset": 0, 00:17:05.499 "data_size": 0 00:17:05.499 }, 00:17:05.499 { 00:17:05.499 "name": "BaseBdev3", 00:17:05.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.499 "is_configured": false, 00:17:05.499 "data_offset": 0, 00:17:05.499 "data_size": 0 00:17:05.499 }, 00:17:05.499 { 00:17:05.499 "name": "BaseBdev4", 00:17:05.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.499 "is_configured": false, 00:17:05.499 "data_offset": 0, 00:17:05.499 "data_size": 0 00:17:05.499 } 00:17:05.499 ] 00:17:05.499 }' 00:17:05.499 13:35:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.499 13:35:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.064 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:06.323 [2024-07-15 13:35:45.503668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:06.323 [2024-07-15 13:35:45.503706] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b34310 name Existed_Raid, state configuring 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:06.323 [2024-07-15 13:35:45.672151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:06.323 [2024-07-15 13:35:45.673585] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:06.323 [2024-07-15 13:35:45.673615] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:06.323 [2024-07-15 13:35:45.673625] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:06.323 [2024-07-15 13:35:45.673637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:06.323 [2024-07-15 13:35:45.673646] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:06.323 [2024-07-15 13:35:45.673657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:06.323 
13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.323 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.581 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.581 "name": "Existed_Raid", 00:17:06.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.581 "strip_size_kb": 64, 00:17:06.581 "state": "configuring", 00:17:06.581 "raid_level": "raid0", 00:17:06.581 "superblock": false, 00:17:06.581 "num_base_bdevs": 4, 00:17:06.581 "num_base_bdevs_discovered": 1, 00:17:06.581 "num_base_bdevs_operational": 4, 00:17:06.581 "base_bdevs_list": [ 00:17:06.581 { 00:17:06.581 "name": "BaseBdev1", 00:17:06.581 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:06.581 "is_configured": true, 00:17:06.581 "data_offset": 0, 00:17:06.581 "data_size": 65536 00:17:06.581 }, 00:17:06.581 { 00:17:06.581 "name": "BaseBdev2", 00:17:06.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.581 "is_configured": false, 00:17:06.581 "data_offset": 0, 00:17:06.581 "data_size": 0 00:17:06.581 }, 00:17:06.581 { 00:17:06.581 "name": "BaseBdev3", 00:17:06.581 
"uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.581 "is_configured": false, 00:17:06.581 "data_offset": 0, 00:17:06.581 "data_size": 0 00:17:06.581 }, 00:17:06.581 { 00:17:06.581 "name": "BaseBdev4", 00:17:06.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.581 "is_configured": false, 00:17:06.581 "data_offset": 0, 00:17:06.581 "data_size": 0 00:17:06.581 } 00:17:06.581 ] 00:17:06.581 }' 00:17:06.581 13:35:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.581 13:35:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.147 13:35:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:07.405 [2024-07-15 13:35:46.694311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:07.405 BaseBdev2 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:07.405 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.663 13:35:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:07.921 [ 00:17:07.921 { 00:17:07.921 "name": "BaseBdev2", 00:17:07.921 "aliases": [ 00:17:07.921 "aa72a065-9e32-470f-9153-faaabc4364f9" 00:17:07.921 ], 00:17:07.921 "product_name": "Malloc disk", 00:17:07.921 "block_size": 512, 00:17:07.921 "num_blocks": 65536, 00:17:07.921 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:07.921 "assigned_rate_limits": { 00:17:07.921 "rw_ios_per_sec": 0, 00:17:07.921 "rw_mbytes_per_sec": 0, 00:17:07.921 "r_mbytes_per_sec": 0, 00:17:07.921 "w_mbytes_per_sec": 0 00:17:07.921 }, 00:17:07.921 "claimed": true, 00:17:07.921 "claim_type": "exclusive_write", 00:17:07.921 "zoned": false, 00:17:07.921 "supported_io_types": { 00:17:07.921 "read": true, 00:17:07.921 "write": true, 00:17:07.921 "unmap": true, 00:17:07.921 "flush": true, 00:17:07.921 "reset": true, 00:17:07.921 "nvme_admin": false, 00:17:07.921 "nvme_io": false, 00:17:07.921 "nvme_io_md": false, 00:17:07.921 "write_zeroes": true, 00:17:07.921 "zcopy": true, 00:17:07.921 "get_zone_info": false, 00:17:07.921 "zone_management": false, 00:17:07.921 "zone_append": false, 00:17:07.921 "compare": false, 00:17:07.921 "compare_and_write": false, 00:17:07.921 "abort": true, 00:17:07.921 "seek_hole": false, 00:17:07.921 "seek_data": false, 00:17:07.921 "copy": true, 00:17:07.921 "nvme_iov_md": false 00:17:07.921 }, 00:17:07.921 "memory_domains": [ 00:17:07.921 { 00:17:07.921 "dma_device_id": "system", 00:17:07.921 "dma_device_type": 1 00:17:07.921 }, 00:17:07.921 { 00:17:07.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.921 "dma_device_type": 2 00:17:07.921 } 00:17:07.921 ], 00:17:07.921 "driver_specific": {} 00:17:07.921 } 00:17:07.921 ] 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.921 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.179 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.179 "name": "Existed_Raid", 00:17:08.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.179 "strip_size_kb": 64, 00:17:08.179 "state": "configuring", 00:17:08.179 "raid_level": "raid0", 00:17:08.179 "superblock": false, 00:17:08.179 "num_base_bdevs": 4, 00:17:08.179 
"num_base_bdevs_discovered": 2, 00:17:08.179 "num_base_bdevs_operational": 4, 00:17:08.179 "base_bdevs_list": [ 00:17:08.179 { 00:17:08.179 "name": "BaseBdev1", 00:17:08.179 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:08.179 "is_configured": true, 00:17:08.179 "data_offset": 0, 00:17:08.179 "data_size": 65536 00:17:08.179 }, 00:17:08.179 { 00:17:08.179 "name": "BaseBdev2", 00:17:08.179 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:08.179 "is_configured": true, 00:17:08.179 "data_offset": 0, 00:17:08.179 "data_size": 65536 00:17:08.179 }, 00:17:08.179 { 00:17:08.179 "name": "BaseBdev3", 00:17:08.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.179 "is_configured": false, 00:17:08.179 "data_offset": 0, 00:17:08.179 "data_size": 0 00:17:08.179 }, 00:17:08.179 { 00:17:08.179 "name": "BaseBdev4", 00:17:08.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.179 "is_configured": false, 00:17:08.179 "data_offset": 0, 00:17:08.179 "data_size": 0 00:17:08.179 } 00:17:08.179 ] 00:17:08.179 }' 00:17:08.179 13:35:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.179 13:35:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.744 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:09.002 [2024-07-15 13:35:48.277890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:09.002 BaseBdev3 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:09.002 13:35:48 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:09.002 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.261 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:09.519 [ 00:17:09.519 { 00:17:09.519 "name": "BaseBdev3", 00:17:09.519 "aliases": [ 00:17:09.519 "cc3f84e6-ad3d-48ec-8612-80668d0da701" 00:17:09.519 ], 00:17:09.519 "product_name": "Malloc disk", 00:17:09.519 "block_size": 512, 00:17:09.519 "num_blocks": 65536, 00:17:09.519 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 00:17:09.519 "assigned_rate_limits": { 00:17:09.519 "rw_ios_per_sec": 0, 00:17:09.519 "rw_mbytes_per_sec": 0, 00:17:09.519 "r_mbytes_per_sec": 0, 00:17:09.519 "w_mbytes_per_sec": 0 00:17:09.519 }, 00:17:09.519 "claimed": true, 00:17:09.519 "claim_type": "exclusive_write", 00:17:09.519 "zoned": false, 00:17:09.519 "supported_io_types": { 00:17:09.519 "read": true, 00:17:09.519 "write": true, 00:17:09.519 "unmap": true, 00:17:09.519 "flush": true, 00:17:09.519 "reset": true, 00:17:09.519 "nvme_admin": false, 00:17:09.519 "nvme_io": false, 00:17:09.519 "nvme_io_md": false, 00:17:09.519 "write_zeroes": true, 00:17:09.519 "zcopy": true, 00:17:09.519 "get_zone_info": false, 00:17:09.519 "zone_management": false, 00:17:09.519 "zone_append": false, 00:17:09.519 "compare": false, 00:17:09.519 "compare_and_write": false, 00:17:09.519 "abort": true, 00:17:09.519 "seek_hole": false, 00:17:09.519 "seek_data": false, 00:17:09.519 "copy": 
true, 00:17:09.519 "nvme_iov_md": false 00:17:09.519 }, 00:17:09.519 "memory_domains": [ 00:17:09.519 { 00:17:09.519 "dma_device_id": "system", 00:17:09.519 "dma_device_type": 1 00:17:09.519 }, 00:17:09.519 { 00:17:09.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.519 "dma_device_type": 2 00:17:09.519 } 00:17:09.519 ], 00:17:09.519 "driver_specific": {} 00:17:09.519 } 00:17:09.519 ] 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:09.519 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:17:09.520 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.778 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.778 "name": "Existed_Raid", 00:17:09.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.778 "strip_size_kb": 64, 00:17:09.778 "state": "configuring", 00:17:09.778 "raid_level": "raid0", 00:17:09.778 "superblock": false, 00:17:09.778 "num_base_bdevs": 4, 00:17:09.778 "num_base_bdevs_discovered": 3, 00:17:09.778 "num_base_bdevs_operational": 4, 00:17:09.778 "base_bdevs_list": [ 00:17:09.778 { 00:17:09.778 "name": "BaseBdev1", 00:17:09.778 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:09.778 "is_configured": true, 00:17:09.778 "data_offset": 0, 00:17:09.778 "data_size": 65536 00:17:09.778 }, 00:17:09.778 { 00:17:09.778 "name": "BaseBdev2", 00:17:09.778 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:09.778 "is_configured": true, 00:17:09.778 "data_offset": 0, 00:17:09.778 "data_size": 65536 00:17:09.778 }, 00:17:09.778 { 00:17:09.778 "name": "BaseBdev3", 00:17:09.778 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 00:17:09.778 "is_configured": true, 00:17:09.778 "data_offset": 0, 00:17:09.778 "data_size": 65536 00:17:09.778 }, 00:17:09.778 { 00:17:09.778 "name": "BaseBdev4", 00:17:09.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.778 "is_configured": false, 00:17:09.778 "data_offset": 0, 00:17:09.778 "data_size": 0 00:17:09.778 } 00:17:09.778 ] 00:17:09.778 }' 00:17:09.778 13:35:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.778 13:35:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.343 13:35:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:10.600 [2024-07-15 13:35:49.785325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:10.600 [2024-07-15 13:35:49.785361] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b35350 00:17:10.600 [2024-07-15 13:35:49.785369] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:10.600 [2024-07-15 13:35:49.785619] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b35020 00:17:10.600 [2024-07-15 13:35:49.785740] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b35350 00:17:10.600 [2024-07-15 13:35:49.785750] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b35350 00:17:10.600 [2024-07-15 13:35:49.785912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.600 BaseBdev4 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:10.600 13:35:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.858 13:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:10.858 [ 00:17:10.858 { 00:17:10.858 "name": "BaseBdev4", 00:17:10.858 "aliases": [ 00:17:10.858 "359ec75e-3762-42aa-8f7f-a5e20c17faef" 00:17:10.858 ], 00:17:10.858 "product_name": "Malloc disk", 00:17:10.858 "block_size": 512, 00:17:10.858 "num_blocks": 65536, 00:17:10.858 "uuid": "359ec75e-3762-42aa-8f7f-a5e20c17faef", 00:17:10.858 "assigned_rate_limits": { 00:17:10.858 "rw_ios_per_sec": 0, 00:17:10.858 "rw_mbytes_per_sec": 0, 00:17:10.858 "r_mbytes_per_sec": 0, 00:17:10.858 "w_mbytes_per_sec": 0 00:17:10.858 }, 00:17:10.858 "claimed": true, 00:17:10.858 "claim_type": "exclusive_write", 00:17:10.858 "zoned": false, 00:17:10.858 "supported_io_types": { 00:17:10.858 "read": true, 00:17:10.858 "write": true, 00:17:10.858 "unmap": true, 00:17:10.858 "flush": true, 00:17:10.858 "reset": true, 00:17:10.858 "nvme_admin": false, 00:17:10.858 "nvme_io": false, 00:17:10.858 "nvme_io_md": false, 00:17:10.858 "write_zeroes": true, 00:17:10.858 "zcopy": true, 00:17:10.858 "get_zone_info": false, 00:17:10.858 "zone_management": false, 00:17:10.858 "zone_append": false, 00:17:10.858 "compare": false, 00:17:10.858 "compare_and_write": false, 00:17:10.858 "abort": true, 00:17:10.858 "seek_hole": false, 00:17:10.858 "seek_data": false, 00:17:10.858 "copy": true, 00:17:10.858 "nvme_iov_md": false 00:17:10.858 }, 00:17:10.858 "memory_domains": [ 00:17:10.858 { 00:17:10.858 "dma_device_id": "system", 00:17:10.858 "dma_device_type": 1 00:17:10.858 }, 00:17:10.858 { 00:17:10.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.858 "dma_device_type": 2 00:17:10.858 } 00:17:10.858 ], 00:17:10.858 "driver_specific": {} 00:17:10.858 } 00:17:10.858 ] 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.116 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.373 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.373 "name": "Existed_Raid", 00:17:11.374 "uuid": "85622e16-0807-4a88-9449-c01780a9a839", 00:17:11.374 "strip_size_kb": 64, 00:17:11.374 "state": "online", 00:17:11.374 "raid_level": "raid0", 00:17:11.374 "superblock": false, 00:17:11.374 "num_base_bdevs": 4, 00:17:11.374 
"num_base_bdevs_discovered": 4, 00:17:11.374 "num_base_bdevs_operational": 4, 00:17:11.374 "base_bdevs_list": [ 00:17:11.374 { 00:17:11.374 "name": "BaseBdev1", 00:17:11.374 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:11.374 "is_configured": true, 00:17:11.374 "data_offset": 0, 00:17:11.374 "data_size": 65536 00:17:11.374 }, 00:17:11.374 { 00:17:11.374 "name": "BaseBdev2", 00:17:11.374 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:11.374 "is_configured": true, 00:17:11.374 "data_offset": 0, 00:17:11.374 "data_size": 65536 00:17:11.374 }, 00:17:11.374 { 00:17:11.374 "name": "BaseBdev3", 00:17:11.374 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 00:17:11.374 "is_configured": true, 00:17:11.374 "data_offset": 0, 00:17:11.374 "data_size": 65536 00:17:11.374 }, 00:17:11.374 { 00:17:11.374 "name": "BaseBdev4", 00:17:11.374 "uuid": "359ec75e-3762-42aa-8f7f-a5e20c17faef", 00:17:11.374 "is_configured": true, 00:17:11.374 "data_offset": 0, 00:17:11.374 "data_size": 65536 00:17:11.374 } 00:17:11.374 ] 00:17:11.374 }' 00:17:11.374 13:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.374 13:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # jq '.[]' 00:17:11.936 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:12.193 [2024-07-15 13:35:51.365847] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.193 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:12.193 "name": "Existed_Raid", 00:17:12.193 "aliases": [ 00:17:12.193 "85622e16-0807-4a88-9449-c01780a9a839" 00:17:12.193 ], 00:17:12.193 "product_name": "Raid Volume", 00:17:12.193 "block_size": 512, 00:17:12.193 "num_blocks": 262144, 00:17:12.193 "uuid": "85622e16-0807-4a88-9449-c01780a9a839", 00:17:12.193 "assigned_rate_limits": { 00:17:12.193 "rw_ios_per_sec": 0, 00:17:12.193 "rw_mbytes_per_sec": 0, 00:17:12.193 "r_mbytes_per_sec": 0, 00:17:12.193 "w_mbytes_per_sec": 0 00:17:12.193 }, 00:17:12.193 "claimed": false, 00:17:12.193 "zoned": false, 00:17:12.193 "supported_io_types": { 00:17:12.193 "read": true, 00:17:12.193 "write": true, 00:17:12.193 "unmap": true, 00:17:12.193 "flush": true, 00:17:12.193 "reset": true, 00:17:12.193 "nvme_admin": false, 00:17:12.193 "nvme_io": false, 00:17:12.193 "nvme_io_md": false, 00:17:12.193 "write_zeroes": true, 00:17:12.193 "zcopy": false, 00:17:12.193 "get_zone_info": false, 00:17:12.193 "zone_management": false, 00:17:12.193 "zone_append": false, 00:17:12.193 "compare": false, 00:17:12.193 "compare_and_write": false, 00:17:12.193 "abort": false, 00:17:12.193 "seek_hole": false, 00:17:12.193 "seek_data": false, 00:17:12.193 "copy": false, 00:17:12.193 "nvme_iov_md": false 00:17:12.193 }, 00:17:12.193 "memory_domains": [ 00:17:12.193 { 00:17:12.193 "dma_device_id": "system", 00:17:12.193 "dma_device_type": 1 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.193 "dma_device_type": 2 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 
"dma_device_id": "system", 00:17:12.193 "dma_device_type": 1 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.193 "dma_device_type": 2 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "system", 00:17:12.193 "dma_device_type": 1 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.193 "dma_device_type": 2 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "system", 00:17:12.193 "dma_device_type": 1 00:17:12.193 }, 00:17:12.193 { 00:17:12.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.193 "dma_device_type": 2 00:17:12.193 } 00:17:12.193 ], 00:17:12.193 "driver_specific": { 00:17:12.193 "raid": { 00:17:12.193 "uuid": "85622e16-0807-4a88-9449-c01780a9a839", 00:17:12.193 "strip_size_kb": 64, 00:17:12.193 "state": "online", 00:17:12.193 "raid_level": "raid0", 00:17:12.193 "superblock": false, 00:17:12.194 "num_base_bdevs": 4, 00:17:12.194 "num_base_bdevs_discovered": 4, 00:17:12.194 "num_base_bdevs_operational": 4, 00:17:12.194 "base_bdevs_list": [ 00:17:12.194 { 00:17:12.194 "name": "BaseBdev1", 00:17:12.194 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:12.194 "is_configured": true, 00:17:12.194 "data_offset": 0, 00:17:12.194 "data_size": 65536 00:17:12.194 }, 00:17:12.194 { 00:17:12.194 "name": "BaseBdev2", 00:17:12.194 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:12.194 "is_configured": true, 00:17:12.194 "data_offset": 0, 00:17:12.194 "data_size": 65536 00:17:12.194 }, 00:17:12.194 { 00:17:12.194 "name": "BaseBdev3", 00:17:12.194 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 00:17:12.194 "is_configured": true, 00:17:12.194 "data_offset": 0, 00:17:12.194 "data_size": 65536 00:17:12.194 }, 00:17:12.194 { 00:17:12.194 "name": "BaseBdev4", 00:17:12.194 "uuid": "359ec75e-3762-42aa-8f7f-a5e20c17faef", 00:17:12.194 "is_configured": true, 00:17:12.194 "data_offset": 0, 00:17:12.194 "data_size": 65536 00:17:12.194 } 00:17:12.194 ] 
00:17:12.194 } 00:17:12.194 } 00:17:12.194 }' 00:17:12.194 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:12.194 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:12.194 BaseBdev2 00:17:12.194 BaseBdev3 00:17:12.194 BaseBdev4' 00:17:12.194 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.194 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:12.194 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.450 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.450 "name": "BaseBdev1", 00:17:12.450 "aliases": [ 00:17:12.450 "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c" 00:17:12.450 ], 00:17:12.450 "product_name": "Malloc disk", 00:17:12.450 "block_size": 512, 00:17:12.450 "num_blocks": 65536, 00:17:12.450 "uuid": "3ba693e5-d42f-4204-aae2-fd4fcfb2f65c", 00:17:12.450 "assigned_rate_limits": { 00:17:12.450 "rw_ios_per_sec": 0, 00:17:12.450 "rw_mbytes_per_sec": 0, 00:17:12.450 "r_mbytes_per_sec": 0, 00:17:12.450 "w_mbytes_per_sec": 0 00:17:12.450 }, 00:17:12.450 "claimed": true, 00:17:12.450 "claim_type": "exclusive_write", 00:17:12.450 "zoned": false, 00:17:12.450 "supported_io_types": { 00:17:12.450 "read": true, 00:17:12.450 "write": true, 00:17:12.450 "unmap": true, 00:17:12.450 "flush": true, 00:17:12.450 "reset": true, 00:17:12.450 "nvme_admin": false, 00:17:12.450 "nvme_io": false, 00:17:12.450 "nvme_io_md": false, 00:17:12.450 "write_zeroes": true, 00:17:12.450 "zcopy": true, 00:17:12.450 "get_zone_info": false, 00:17:12.450 "zone_management": false, 00:17:12.450 "zone_append": false, 00:17:12.450 "compare": 
false, 00:17:12.450 "compare_and_write": false, 00:17:12.450 "abort": true, 00:17:12.450 "seek_hole": false, 00:17:12.450 "seek_data": false, 00:17:12.451 "copy": true, 00:17:12.451 "nvme_iov_md": false 00:17:12.451 }, 00:17:12.451 "memory_domains": [ 00:17:12.451 { 00:17:12.451 "dma_device_id": "system", 00:17:12.451 "dma_device_type": 1 00:17:12.451 }, 00:17:12.451 { 00:17:12.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.451 "dma_device_type": 2 00:17:12.451 } 00:17:12.451 ], 00:17:12.451 "driver_specific": {} 00:17:12.451 }' 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.451 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:12.708 13:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.965 "name": "BaseBdev2", 00:17:12.965 "aliases": [ 00:17:12.965 "aa72a065-9e32-470f-9153-faaabc4364f9" 00:17:12.965 ], 00:17:12.965 "product_name": "Malloc disk", 00:17:12.965 "block_size": 512, 00:17:12.965 "num_blocks": 65536, 00:17:12.965 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:12.965 "assigned_rate_limits": { 00:17:12.965 "rw_ios_per_sec": 0, 00:17:12.965 "rw_mbytes_per_sec": 0, 00:17:12.965 "r_mbytes_per_sec": 0, 00:17:12.965 "w_mbytes_per_sec": 0 00:17:12.965 }, 00:17:12.965 "claimed": true, 00:17:12.965 "claim_type": "exclusive_write", 00:17:12.965 "zoned": false, 00:17:12.965 "supported_io_types": { 00:17:12.965 "read": true, 00:17:12.965 "write": true, 00:17:12.965 "unmap": true, 00:17:12.965 "flush": true, 00:17:12.965 "reset": true, 00:17:12.965 "nvme_admin": false, 00:17:12.965 "nvme_io": false, 00:17:12.965 "nvme_io_md": false, 00:17:12.965 "write_zeroes": true, 00:17:12.965 "zcopy": true, 00:17:12.965 "get_zone_info": false, 00:17:12.965 "zone_management": false, 00:17:12.965 "zone_append": false, 00:17:12.965 "compare": false, 00:17:12.965 "compare_and_write": false, 00:17:12.965 "abort": true, 00:17:12.965 "seek_hole": false, 00:17:12.965 "seek_data": false, 00:17:12.965 "copy": true, 00:17:12.965 "nvme_iov_md": false 00:17:12.965 }, 00:17:12.965 "memory_domains": [ 00:17:12.965 { 00:17:12.965 "dma_device_id": "system", 00:17:12.965 "dma_device_type": 1 00:17:12.965 }, 00:17:12.965 { 00:17:12.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.965 "dma_device_type": 2 00:17:12.965 } 00:17:12.965 ], 00:17:12.965 "driver_specific": {} 00:17:12.965 }' 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.965 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:13.222 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:13.480 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:13.480 "name": "BaseBdev3", 00:17:13.480 "aliases": [ 00:17:13.480 "cc3f84e6-ad3d-48ec-8612-80668d0da701" 00:17:13.480 ], 00:17:13.480 "product_name": "Malloc disk", 00:17:13.480 "block_size": 512, 00:17:13.480 "num_blocks": 65536, 00:17:13.480 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 
00:17:13.480 "assigned_rate_limits": { 00:17:13.480 "rw_ios_per_sec": 0, 00:17:13.480 "rw_mbytes_per_sec": 0, 00:17:13.480 "r_mbytes_per_sec": 0, 00:17:13.480 "w_mbytes_per_sec": 0 00:17:13.480 }, 00:17:13.480 "claimed": true, 00:17:13.480 "claim_type": "exclusive_write", 00:17:13.480 "zoned": false, 00:17:13.480 "supported_io_types": { 00:17:13.480 "read": true, 00:17:13.480 "write": true, 00:17:13.480 "unmap": true, 00:17:13.480 "flush": true, 00:17:13.480 "reset": true, 00:17:13.480 "nvme_admin": false, 00:17:13.480 "nvme_io": false, 00:17:13.480 "nvme_io_md": false, 00:17:13.480 "write_zeroes": true, 00:17:13.480 "zcopy": true, 00:17:13.480 "get_zone_info": false, 00:17:13.480 "zone_management": false, 00:17:13.480 "zone_append": false, 00:17:13.480 "compare": false, 00:17:13.480 "compare_and_write": false, 00:17:13.480 "abort": true, 00:17:13.480 "seek_hole": false, 00:17:13.480 "seek_data": false, 00:17:13.480 "copy": true, 00:17:13.480 "nvme_iov_md": false 00:17:13.480 }, 00:17:13.480 "memory_domains": [ 00:17:13.480 { 00:17:13.480 "dma_device_id": "system", 00:17:13.480 "dma_device_type": 1 00:17:13.480 }, 00:17:13.480 { 00:17:13.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.480 "dma_device_type": 2 00:17:13.480 } 00:17:13.480 ], 00:17:13.480 "driver_specific": {} 00:17:13.480 }' 00:17:13.480 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.737 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.737 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:13.737 13:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.737 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.737 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:13.737 13:35:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.737 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:13.993 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:14.250 "name": "BaseBdev4", 00:17:14.250 "aliases": [ 00:17:14.250 "359ec75e-3762-42aa-8f7f-a5e20c17faef" 00:17:14.250 ], 00:17:14.250 "product_name": "Malloc disk", 00:17:14.250 "block_size": 512, 00:17:14.250 "num_blocks": 65536, 00:17:14.250 "uuid": "359ec75e-3762-42aa-8f7f-a5e20c17faef", 00:17:14.250 "assigned_rate_limits": { 00:17:14.250 "rw_ios_per_sec": 0, 00:17:14.250 "rw_mbytes_per_sec": 0, 00:17:14.250 "r_mbytes_per_sec": 0, 00:17:14.250 "w_mbytes_per_sec": 0 00:17:14.250 }, 00:17:14.250 "claimed": true, 00:17:14.250 "claim_type": "exclusive_write", 00:17:14.250 "zoned": false, 00:17:14.250 "supported_io_types": { 00:17:14.250 "read": true, 00:17:14.250 "write": true, 00:17:14.250 "unmap": true, 00:17:14.250 "flush": true, 00:17:14.250 "reset": true, 00:17:14.250 "nvme_admin": false, 00:17:14.250 "nvme_io": false, 00:17:14.250 "nvme_io_md": false, 00:17:14.250 "write_zeroes": true, 
00:17:14.250 "zcopy": true, 00:17:14.250 "get_zone_info": false, 00:17:14.250 "zone_management": false, 00:17:14.250 "zone_append": false, 00:17:14.250 "compare": false, 00:17:14.250 "compare_and_write": false, 00:17:14.250 "abort": true, 00:17:14.250 "seek_hole": false, 00:17:14.250 "seek_data": false, 00:17:14.250 "copy": true, 00:17:14.250 "nvme_iov_md": false 00:17:14.250 }, 00:17:14.250 "memory_domains": [ 00:17:14.250 { 00:17:14.250 "dma_device_id": "system", 00:17:14.250 "dma_device_type": 1 00:17:14.250 }, 00:17:14.250 { 00:17:14.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.250 "dma_device_type": 2 00:17:14.250 } 00:17:14.250 ], 00:17:14.250 "driver_specific": {} 00:17:14.250 }' 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:14.250 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.506 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.507 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.507 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.507 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.507 13:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.507 13:35:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:14.764 [2024-07-15 13:35:54.080794] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:14.764 [2024-07-15 13:35:54.080822] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:14.764 [2024-07-15 13:35:54.080873] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:14.764 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:14.764 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:14.764 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:14.764 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.765 13:35:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.765 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.023 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.023 "name": "Existed_Raid", 00:17:15.023 "uuid": "85622e16-0807-4a88-9449-c01780a9a839", 00:17:15.023 "strip_size_kb": 64, 00:17:15.023 "state": "offline", 00:17:15.023 "raid_level": "raid0", 00:17:15.023 "superblock": false, 00:17:15.023 "num_base_bdevs": 4, 00:17:15.023 "num_base_bdevs_discovered": 3, 00:17:15.023 "num_base_bdevs_operational": 3, 00:17:15.023 "base_bdevs_list": [ 00:17:15.023 { 00:17:15.023 "name": null, 00:17:15.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.023 "is_configured": false, 00:17:15.023 "data_offset": 0, 00:17:15.023 "data_size": 65536 00:17:15.023 }, 00:17:15.023 { 00:17:15.023 "name": "BaseBdev2", 00:17:15.023 "uuid": "aa72a065-9e32-470f-9153-faaabc4364f9", 00:17:15.023 "is_configured": true, 00:17:15.023 "data_offset": 0, 00:17:15.023 "data_size": 65536 00:17:15.023 }, 00:17:15.023 { 00:17:15.023 "name": "BaseBdev3", 00:17:15.023 "uuid": "cc3f84e6-ad3d-48ec-8612-80668d0da701", 00:17:15.023 "is_configured": true, 00:17:15.023 "data_offset": 0, 00:17:15.023 "data_size": 65536 00:17:15.023 }, 00:17:15.023 { 00:17:15.023 "name": "BaseBdev4", 00:17:15.023 "uuid": "359ec75e-3762-42aa-8f7f-a5e20c17faef", 00:17:15.023 "is_configured": true, 00:17:15.023 "data_offset": 0, 00:17:15.023 "data_size": 65536 00:17:15.023 } 00:17:15.023 ] 00:17:15.023 }' 00:17:15.023 13:35:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.023 13:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.588 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:15.588 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:15.588 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.588 13:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:15.846 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:15.846 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:15.846 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:16.104 [2024-07-15 13:35:55.426324] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:16.104 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:16.104 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:16.104 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:16.104 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.369 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:16.369 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid 
']' 00:17:16.369 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:16.636 [2024-07-15 13:35:55.934258] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:16.636 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:16.636 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:16.636 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.636 13:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:17.202 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:17.202 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:17.202 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:17.460 [2024-07-15 13:35:56.696830] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:17.460 [2024-07-15 13:35:56.696876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b35350 name Existed_Raid, state offline 00:17:17.460 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:17.460 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:17.460 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.460 13:35:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:17.718 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:17.718 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:17.718 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:17.718 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:17.718 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:17.719 13:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:17.977 BaseBdev2 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:17.977 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:18.236 [ 00:17:18.236 { 00:17:18.236 "name": "BaseBdev2", 00:17:18.236 "aliases": 
[ 00:17:18.236 "e1ac618c-ddb1-4367-9852-cb4d41946ccc" 00:17:18.236 ], 00:17:18.236 "product_name": "Malloc disk", 00:17:18.236 "block_size": 512, 00:17:18.236 "num_blocks": 65536, 00:17:18.236 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:18.236 "assigned_rate_limits": { 00:17:18.236 "rw_ios_per_sec": 0, 00:17:18.236 "rw_mbytes_per_sec": 0, 00:17:18.236 "r_mbytes_per_sec": 0, 00:17:18.236 "w_mbytes_per_sec": 0 00:17:18.236 }, 00:17:18.236 "claimed": false, 00:17:18.236 "zoned": false, 00:17:18.236 "supported_io_types": { 00:17:18.236 "read": true, 00:17:18.236 "write": true, 00:17:18.236 "unmap": true, 00:17:18.236 "flush": true, 00:17:18.236 "reset": true, 00:17:18.236 "nvme_admin": false, 00:17:18.236 "nvme_io": false, 00:17:18.236 "nvme_io_md": false, 00:17:18.236 "write_zeroes": true, 00:17:18.236 "zcopy": true, 00:17:18.236 "get_zone_info": false, 00:17:18.236 "zone_management": false, 00:17:18.236 "zone_append": false, 00:17:18.236 "compare": false, 00:17:18.236 "compare_and_write": false, 00:17:18.236 "abort": true, 00:17:18.236 "seek_hole": false, 00:17:18.236 "seek_data": false, 00:17:18.236 "copy": true, 00:17:18.236 "nvme_iov_md": false 00:17:18.236 }, 00:17:18.236 "memory_domains": [ 00:17:18.236 { 00:17:18.236 "dma_device_id": "system", 00:17:18.236 "dma_device_type": 1 00:17:18.236 }, 00:17:18.236 { 00:17:18.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.236 "dma_device_type": 2 00:17:18.236 } 00:17:18.236 ], 00:17:18.236 "driver_specific": {} 00:17:18.236 } 00:17:18.236 ] 00:17:18.236 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:18.236 13:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:18.236 13:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:18.236 13:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:18.494 BaseBdev3 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:18.494 13:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:18.753 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:19.011 [ 00:17:19.011 { 00:17:19.011 "name": "BaseBdev3", 00:17:19.011 "aliases": [ 00:17:19.011 "a8df66d8-f3b1-48b6-844b-6678ddb89217" 00:17:19.011 ], 00:17:19.011 "product_name": "Malloc disk", 00:17:19.011 "block_size": 512, 00:17:19.011 "num_blocks": 65536, 00:17:19.011 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:19.011 "assigned_rate_limits": { 00:17:19.011 "rw_ios_per_sec": 0, 00:17:19.011 "rw_mbytes_per_sec": 0, 00:17:19.011 "r_mbytes_per_sec": 0, 00:17:19.011 "w_mbytes_per_sec": 0 00:17:19.011 }, 00:17:19.011 "claimed": false, 00:17:19.011 "zoned": false, 00:17:19.011 "supported_io_types": { 00:17:19.011 "read": true, 00:17:19.011 "write": true, 00:17:19.011 "unmap": true, 00:17:19.011 "flush": true, 00:17:19.011 "reset": true, 00:17:19.011 "nvme_admin": false, 00:17:19.011 
"nvme_io": false, 00:17:19.011 "nvme_io_md": false, 00:17:19.011 "write_zeroes": true, 00:17:19.011 "zcopy": true, 00:17:19.011 "get_zone_info": false, 00:17:19.011 "zone_management": false, 00:17:19.011 "zone_append": false, 00:17:19.011 "compare": false, 00:17:19.011 "compare_and_write": false, 00:17:19.011 "abort": true, 00:17:19.011 "seek_hole": false, 00:17:19.011 "seek_data": false, 00:17:19.011 "copy": true, 00:17:19.011 "nvme_iov_md": false 00:17:19.011 }, 00:17:19.011 "memory_domains": [ 00:17:19.011 { 00:17:19.011 "dma_device_id": "system", 00:17:19.011 "dma_device_type": 1 00:17:19.011 }, 00:17:19.011 { 00:17:19.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.011 "dma_device_type": 2 00:17:19.011 } 00:17:19.011 ], 00:17:19.011 "driver_specific": {} 00:17:19.011 } 00:17:19.011 ] 00:17:19.012 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:19.012 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:19.012 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:19.012 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:19.270 BaseBdev4 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:19.270 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.528 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:19.528 [ 00:17:19.528 { 00:17:19.528 "name": "BaseBdev4", 00:17:19.528 "aliases": [ 00:17:19.528 "54d7e117-3a9c-4acb-9672-e6b507bc02e8" 00:17:19.528 ], 00:17:19.528 "product_name": "Malloc disk", 00:17:19.528 "block_size": 512, 00:17:19.528 "num_blocks": 65536, 00:17:19.528 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:19.528 "assigned_rate_limits": { 00:17:19.528 "rw_ios_per_sec": 0, 00:17:19.528 "rw_mbytes_per_sec": 0, 00:17:19.528 "r_mbytes_per_sec": 0, 00:17:19.528 "w_mbytes_per_sec": 0 00:17:19.528 }, 00:17:19.528 "claimed": false, 00:17:19.528 "zoned": false, 00:17:19.528 "supported_io_types": { 00:17:19.528 "read": true, 00:17:19.528 "write": true, 00:17:19.528 "unmap": true, 00:17:19.528 "flush": true, 00:17:19.528 "reset": true, 00:17:19.528 "nvme_admin": false, 00:17:19.528 "nvme_io": false, 00:17:19.528 "nvme_io_md": false, 00:17:19.528 "write_zeroes": true, 00:17:19.528 "zcopy": true, 00:17:19.528 "get_zone_info": false, 00:17:19.528 "zone_management": false, 00:17:19.528 "zone_append": false, 00:17:19.528 "compare": false, 00:17:19.528 "compare_and_write": false, 00:17:19.528 "abort": true, 00:17:19.528 "seek_hole": false, 00:17:19.528 "seek_data": false, 00:17:19.528 "copy": true, 00:17:19.528 "nvme_iov_md": false 00:17:19.528 }, 00:17:19.528 "memory_domains": [ 00:17:19.528 { 00:17:19.528 "dma_device_id": "system", 00:17:19.528 "dma_device_type": 1 00:17:19.528 }, 00:17:19.528 { 00:17:19.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.528 "dma_device_type": 
2 00:17:19.528 } 00:17:19.528 ], 00:17:19.528 "driver_specific": {} 00:17:19.528 } 00:17:19.528 ] 00:17:19.528 13:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:19.528 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:19.528 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:19.528 13:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:19.787 [2024-07-15 13:35:59.160456] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:19.787 [2024-07-15 13:35:59.160498] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:19.787 [2024-07-15 13:35:59.160525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:19.787 [2024-07-15 13:35:59.161885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:19.787 [2024-07-15 13:35:59.161936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.787 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.045 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.045 "name": "Existed_Raid", 00:17:20.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.045 "strip_size_kb": 64, 00:17:20.045 "state": "configuring", 00:17:20.045 "raid_level": "raid0", 00:17:20.045 "superblock": false, 00:17:20.045 "num_base_bdevs": 4, 00:17:20.045 "num_base_bdevs_discovered": 3, 00:17:20.045 "num_base_bdevs_operational": 4, 00:17:20.045 "base_bdevs_list": [ 00:17:20.045 { 00:17:20.045 "name": "BaseBdev1", 00:17:20.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.045 "is_configured": false, 00:17:20.045 "data_offset": 0, 00:17:20.045 "data_size": 0 00:17:20.045 }, 00:17:20.045 { 00:17:20.045 "name": "BaseBdev2", 00:17:20.045 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:20.045 "is_configured": true, 00:17:20.045 "data_offset": 0, 00:17:20.045 "data_size": 65536 00:17:20.045 }, 00:17:20.045 { 00:17:20.045 "name": "BaseBdev3", 00:17:20.045 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:20.045 "is_configured": true, 00:17:20.045 "data_offset": 0, 00:17:20.045 "data_size": 65536 00:17:20.045 
}, 00:17:20.045 { 00:17:20.045 "name": "BaseBdev4", 00:17:20.045 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:20.045 "is_configured": true, 00:17:20.045 "data_offset": 0, 00:17:20.045 "data_size": 65536 00:17:20.045 } 00:17:20.045 ] 00:17:20.045 }' 00:17:20.045 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.045 13:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.611 13:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:20.869 [2024-07-15 13:36:00.219330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.870 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.129 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.129 "name": "Existed_Raid", 00:17:21.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.129 "strip_size_kb": 64, 00:17:21.129 "state": "configuring", 00:17:21.129 "raid_level": "raid0", 00:17:21.129 "superblock": false, 00:17:21.129 "num_base_bdevs": 4, 00:17:21.129 "num_base_bdevs_discovered": 2, 00:17:21.129 "num_base_bdevs_operational": 4, 00:17:21.129 "base_bdevs_list": [ 00:17:21.129 { 00:17:21.129 "name": "BaseBdev1", 00:17:21.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.129 "is_configured": false, 00:17:21.129 "data_offset": 0, 00:17:21.129 "data_size": 0 00:17:21.129 }, 00:17:21.129 { 00:17:21.129 "name": null, 00:17:21.129 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:21.129 "is_configured": false, 00:17:21.129 "data_offset": 0, 00:17:21.129 "data_size": 65536 00:17:21.129 }, 00:17:21.129 { 00:17:21.129 "name": "BaseBdev3", 00:17:21.129 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:21.129 "is_configured": true, 00:17:21.129 "data_offset": 0, 00:17:21.129 "data_size": 65536 00:17:21.129 }, 00:17:21.129 { 00:17:21.129 "name": "BaseBdev4", 00:17:21.129 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:21.129 "is_configured": true, 00:17:21.129 "data_offset": 0, 00:17:21.129 "data_size": 65536 00:17:21.129 } 00:17:21.129 ] 00:17:21.129 }' 00:17:21.129 13:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.129 13:36:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.695 13:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.695 13:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:21.953 13:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:21.953 13:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:22.212 [2024-07-15 13:36:01.563300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:22.212 BaseBdev1 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:22.212 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.470 13:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:22.730 [ 00:17:22.730 { 00:17:22.730 "name": "BaseBdev1", 00:17:22.730 "aliases": [ 00:17:22.730 "5cfd0f54-111a-4365-a050-63a7c0e3b6f2" 00:17:22.730 ], 00:17:22.730 
"product_name": "Malloc disk", 00:17:22.730 "block_size": 512, 00:17:22.730 "num_blocks": 65536, 00:17:22.730 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:22.730 "assigned_rate_limits": { 00:17:22.730 "rw_ios_per_sec": 0, 00:17:22.730 "rw_mbytes_per_sec": 0, 00:17:22.730 "r_mbytes_per_sec": 0, 00:17:22.730 "w_mbytes_per_sec": 0 00:17:22.730 }, 00:17:22.730 "claimed": true, 00:17:22.730 "claim_type": "exclusive_write", 00:17:22.730 "zoned": false, 00:17:22.730 "supported_io_types": { 00:17:22.730 "read": true, 00:17:22.730 "write": true, 00:17:22.730 "unmap": true, 00:17:22.730 "flush": true, 00:17:22.730 "reset": true, 00:17:22.730 "nvme_admin": false, 00:17:22.730 "nvme_io": false, 00:17:22.730 "nvme_io_md": false, 00:17:22.730 "write_zeroes": true, 00:17:22.730 "zcopy": true, 00:17:22.730 "get_zone_info": false, 00:17:22.730 "zone_management": false, 00:17:22.730 "zone_append": false, 00:17:22.730 "compare": false, 00:17:22.730 "compare_and_write": false, 00:17:22.730 "abort": true, 00:17:22.730 "seek_hole": false, 00:17:22.730 "seek_data": false, 00:17:22.730 "copy": true, 00:17:22.730 "nvme_iov_md": false 00:17:22.730 }, 00:17:22.730 "memory_domains": [ 00:17:22.730 { 00:17:22.730 "dma_device_id": "system", 00:17:22.730 "dma_device_type": 1 00:17:22.730 }, 00:17:22.730 { 00:17:22.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.730 "dma_device_type": 2 00:17:22.730 } 00:17:22.730 ], 00:17:22.730 "driver_specific": {} 00:17:22.730 } 00:17:22.730 ] 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.730 
13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.730 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.989 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.989 "name": "Existed_Raid", 00:17:22.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.989 "strip_size_kb": 64, 00:17:22.989 "state": "configuring", 00:17:22.989 "raid_level": "raid0", 00:17:22.989 "superblock": false, 00:17:22.989 "num_base_bdevs": 4, 00:17:22.989 "num_base_bdevs_discovered": 3, 00:17:22.989 "num_base_bdevs_operational": 4, 00:17:22.989 "base_bdevs_list": [ 00:17:22.989 { 00:17:22.989 "name": "BaseBdev1", 00:17:22.989 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:22.989 "is_configured": true, 00:17:22.989 "data_offset": 0, 00:17:22.989 "data_size": 65536 00:17:22.989 }, 00:17:22.989 { 00:17:22.989 "name": null, 00:17:22.989 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:22.989 "is_configured": false, 00:17:22.989 "data_offset": 0, 
00:17:22.989 "data_size": 65536 00:17:22.989 }, 00:17:22.989 { 00:17:22.989 "name": "BaseBdev3", 00:17:22.989 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:22.989 "is_configured": true, 00:17:22.989 "data_offset": 0, 00:17:22.989 "data_size": 65536 00:17:22.989 }, 00:17:22.989 { 00:17:22.989 "name": "BaseBdev4", 00:17:22.989 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:22.989 "is_configured": true, 00:17:22.989 "data_offset": 0, 00:17:22.989 "data_size": 65536 00:17:22.989 } 00:17:22.989 ] 00:17:22.989 }' 00:17:22.989 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.989 13:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.556 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.556 13:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:23.815 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:23.815 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:24.073 [2024-07-15 13:36:03.295948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:24.073 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.074 "name": "Existed_Raid", 00:17:24.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.074 "strip_size_kb": 64, 00:17:24.074 "state": "configuring", 00:17:24.074 "raid_level": "raid0", 00:17:24.074 "superblock": false, 00:17:24.074 "num_base_bdevs": 4, 00:17:24.074 "num_base_bdevs_discovered": 2, 00:17:24.074 "num_base_bdevs_operational": 4, 00:17:24.074 "base_bdevs_list": [ 00:17:24.074 { 00:17:24.074 "name": "BaseBdev1", 00:17:24.074 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:24.074 "is_configured": true, 00:17:24.074 "data_offset": 0, 00:17:24.074 "data_size": 65536 00:17:24.074 }, 00:17:24.074 { 00:17:24.074 "name": null, 00:17:24.074 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:24.074 "is_configured": false, 00:17:24.074 "data_offset": 0, 00:17:24.074 "data_size": 65536 00:17:24.074 }, 00:17:24.074 { 00:17:24.074 "name": null, 00:17:24.074 
"uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:24.074 "is_configured": false, 00:17:24.074 "data_offset": 0, 00:17:24.074 "data_size": 65536 00:17:24.074 }, 00:17:24.074 { 00:17:24.074 "name": "BaseBdev4", 00:17:24.074 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:24.074 "is_configured": true, 00:17:24.074 "data_offset": 0, 00:17:24.074 "data_size": 65536 00:17:24.074 } 00:17:24.074 ] 00:17:24.074 }' 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.074 13:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.642 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:24.642 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.901 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:24.901 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:25.160 [2024-07-15 13:36:04.463213] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.160 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.419 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.419 "name": "Existed_Raid", 00:17:25.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.419 "strip_size_kb": 64, 00:17:25.419 "state": "configuring", 00:17:25.419 "raid_level": "raid0", 00:17:25.419 "superblock": false, 00:17:25.419 "num_base_bdevs": 4, 00:17:25.419 "num_base_bdevs_discovered": 3, 00:17:25.419 "num_base_bdevs_operational": 4, 00:17:25.419 "base_bdevs_list": [ 00:17:25.419 { 00:17:25.419 "name": "BaseBdev1", 00:17:25.419 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:25.419 "is_configured": true, 00:17:25.419 "data_offset": 0, 00:17:25.419 "data_size": 65536 00:17:25.419 }, 00:17:25.419 { 00:17:25.419 "name": null, 00:17:25.419 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:25.419 "is_configured": false, 00:17:25.419 "data_offset": 0, 00:17:25.419 "data_size": 65536 00:17:25.419 }, 00:17:25.419 { 00:17:25.419 "name": "BaseBdev3", 00:17:25.419 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:25.419 "is_configured": 
true, 00:17:25.419 "data_offset": 0, 00:17:25.419 "data_size": 65536 00:17:25.419 }, 00:17:25.419 { 00:17:25.419 "name": "BaseBdev4", 00:17:25.419 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:25.419 "is_configured": true, 00:17:25.419 "data_offset": 0, 00:17:25.419 "data_size": 65536 00:17:25.419 } 00:17:25.419 ] 00:17:25.419 }' 00:17:25.419 13:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.419 13:36:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.986 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.986 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:26.245 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:26.245 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:26.503 [2024-07-15 13:36:05.802769] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.503 13:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.070 13:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.070 "name": "Existed_Raid", 00:17:27.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.070 "strip_size_kb": 64, 00:17:27.070 "state": "configuring", 00:17:27.070 "raid_level": "raid0", 00:17:27.070 "superblock": false, 00:17:27.070 "num_base_bdevs": 4, 00:17:27.070 "num_base_bdevs_discovered": 2, 00:17:27.070 "num_base_bdevs_operational": 4, 00:17:27.070 "base_bdevs_list": [ 00:17:27.070 { 00:17:27.070 "name": null, 00:17:27.070 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:27.070 "is_configured": false, 00:17:27.070 "data_offset": 0, 00:17:27.070 "data_size": 65536 00:17:27.070 }, 00:17:27.070 { 00:17:27.070 "name": null, 00:17:27.070 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:27.070 "is_configured": false, 00:17:27.070 "data_offset": 0, 00:17:27.070 "data_size": 65536 00:17:27.070 }, 00:17:27.070 { 00:17:27.070 "name": "BaseBdev3", 00:17:27.070 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:27.070 "is_configured": true, 00:17:27.070 "data_offset": 0, 00:17:27.070 "data_size": 65536 00:17:27.070 }, 00:17:27.070 { 00:17:27.070 "name": 
"BaseBdev4", 00:17:27.070 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:27.070 "is_configured": true, 00:17:27.070 "data_offset": 0, 00:17:27.070 "data_size": 65536 00:17:27.070 } 00:17:27.070 ] 00:17:27.070 }' 00:17:27.070 13:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.070 13:36:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.636 13:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.636 13:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:27.893 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:27.893 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:28.152 [2024-07-15 13:36:07.401649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.152 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.718 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.718 "name": "Existed_Raid", 00:17:28.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.718 "strip_size_kb": 64, 00:17:28.718 "state": "configuring", 00:17:28.718 "raid_level": "raid0", 00:17:28.718 "superblock": false, 00:17:28.718 "num_base_bdevs": 4, 00:17:28.718 "num_base_bdevs_discovered": 3, 00:17:28.718 "num_base_bdevs_operational": 4, 00:17:28.718 "base_bdevs_list": [ 00:17:28.718 { 00:17:28.718 "name": null, 00:17:28.718 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:28.718 "is_configured": false, 00:17:28.718 "data_offset": 0, 00:17:28.718 "data_size": 65536 00:17:28.718 }, 00:17:28.718 { 00:17:28.718 "name": "BaseBdev2", 00:17:28.718 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:28.718 "is_configured": true, 00:17:28.718 "data_offset": 0, 00:17:28.718 "data_size": 65536 00:17:28.718 }, 00:17:28.718 { 00:17:28.718 "name": "BaseBdev3", 00:17:28.718 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:28.718 "is_configured": true, 00:17:28.718 "data_offset": 0, 00:17:28.718 "data_size": 65536 00:17:28.718 }, 00:17:28.718 { 00:17:28.718 "name": "BaseBdev4", 00:17:28.718 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 
00:17:28.718 "is_configured": true, 00:17:28.718 "data_offset": 0, 00:17:28.718 "data_size": 65536 00:17:28.718 } 00:17:28.718 ] 00:17:28.718 }' 00:17:28.718 13:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.718 13:36:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.284 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:29.284 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.543 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:29.543 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.543 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:29.543 13:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5cfd0f54-111a-4365-a050-63a7c0e3b6f2 00:17:30.111 [2024-07-15 13:36:09.450466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:30.111 [2024-07-15 13:36:09.450504] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b39040 00:17:30.111 [2024-07-15 13:36:09.450512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:30.111 [2024-07-15 13:36:09.450704] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b34a70 00:17:30.111 [2024-07-15 13:36:09.450818] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b39040 
00:17:30.111 [2024-07-15 13:36:09.450828] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b39040 00:17:30.111 [2024-07-15 13:36:09.450999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:30.111 NewBaseBdev 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:30.111 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:30.370 13:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:31.006 [ 00:17:31.006 { 00:17:31.006 "name": "NewBaseBdev", 00:17:31.006 "aliases": [ 00:17:31.006 "5cfd0f54-111a-4365-a050-63a7c0e3b6f2" 00:17:31.006 ], 00:17:31.006 "product_name": "Malloc disk", 00:17:31.006 "block_size": 512, 00:17:31.006 "num_blocks": 65536, 00:17:31.006 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:31.006 "assigned_rate_limits": { 00:17:31.006 "rw_ios_per_sec": 0, 00:17:31.006 "rw_mbytes_per_sec": 0, 00:17:31.006 "r_mbytes_per_sec": 0, 00:17:31.006 "w_mbytes_per_sec": 0 00:17:31.006 }, 00:17:31.006 "claimed": true, 00:17:31.006 "claim_type": 
"exclusive_write", 00:17:31.006 "zoned": false, 00:17:31.006 "supported_io_types": { 00:17:31.006 "read": true, 00:17:31.006 "write": true, 00:17:31.006 "unmap": true, 00:17:31.006 "flush": true, 00:17:31.006 "reset": true, 00:17:31.006 "nvme_admin": false, 00:17:31.006 "nvme_io": false, 00:17:31.006 "nvme_io_md": false, 00:17:31.006 "write_zeroes": true, 00:17:31.006 "zcopy": true, 00:17:31.006 "get_zone_info": false, 00:17:31.006 "zone_management": false, 00:17:31.006 "zone_append": false, 00:17:31.006 "compare": false, 00:17:31.006 "compare_and_write": false, 00:17:31.006 "abort": true, 00:17:31.006 "seek_hole": false, 00:17:31.006 "seek_data": false, 00:17:31.006 "copy": true, 00:17:31.006 "nvme_iov_md": false 00:17:31.006 }, 00:17:31.006 "memory_domains": [ 00:17:31.006 { 00:17:31.006 "dma_device_id": "system", 00:17:31.006 "dma_device_type": 1 00:17:31.006 }, 00:17:31.006 { 00:17:31.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.007 "dma_device_type": 2 00:17:31.007 } 00:17:31.007 ], 00:17:31.007 "driver_specific": {} 00:17:31.007 } 00:17:31.007 ] 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.007 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.266 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.266 "name": "Existed_Raid", 00:17:31.266 "uuid": "3a7345e5-aae5-4013-a64c-07d40cf20f25", 00:17:31.266 "strip_size_kb": 64, 00:17:31.266 "state": "online", 00:17:31.266 "raid_level": "raid0", 00:17:31.266 "superblock": false, 00:17:31.266 "num_base_bdevs": 4, 00:17:31.266 "num_base_bdevs_discovered": 4, 00:17:31.266 "num_base_bdevs_operational": 4, 00:17:31.266 "base_bdevs_list": [ 00:17:31.266 { 00:17:31.266 "name": "NewBaseBdev", 00:17:31.266 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:31.266 "is_configured": true, 00:17:31.266 "data_offset": 0, 00:17:31.266 "data_size": 65536 00:17:31.266 }, 00:17:31.266 { 00:17:31.266 "name": "BaseBdev2", 00:17:31.266 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:31.266 "is_configured": true, 00:17:31.266 "data_offset": 0, 00:17:31.266 "data_size": 65536 00:17:31.266 }, 00:17:31.266 { 00:17:31.266 "name": "BaseBdev3", 00:17:31.266 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:31.266 "is_configured": true, 00:17:31.266 "data_offset": 0, 00:17:31.266 "data_size": 65536 00:17:31.266 }, 00:17:31.266 { 00:17:31.266 "name": "BaseBdev4", 00:17:31.266 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:31.266 "is_configured": true, 
00:17:31.266 "data_offset": 0, 00:17:31.266 "data_size": 65536 00:17:31.266 } 00:17:31.266 ] 00:17:31.266 }' 00:17:31.266 13:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.266 13:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.203 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:32.203 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:32.203 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:32.203 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:32.204 [2024-07-15 13:36:11.560384] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:32.204 "name": "Existed_Raid", 00:17:32.204 "aliases": [ 00:17:32.204 "3a7345e5-aae5-4013-a64c-07d40cf20f25" 00:17:32.204 ], 00:17:32.204 "product_name": "Raid Volume", 00:17:32.204 "block_size": 512, 00:17:32.204 "num_blocks": 262144, 00:17:32.204 "uuid": "3a7345e5-aae5-4013-a64c-07d40cf20f25", 00:17:32.204 "assigned_rate_limits": { 00:17:32.204 "rw_ios_per_sec": 0, 00:17:32.204 "rw_mbytes_per_sec": 0, 00:17:32.204 "r_mbytes_per_sec": 0, 00:17:32.204 "w_mbytes_per_sec": 0 
00:17:32.204 }, 00:17:32.204 "claimed": false, 00:17:32.204 "zoned": false, 00:17:32.204 "supported_io_types": { 00:17:32.204 "read": true, 00:17:32.204 "write": true, 00:17:32.204 "unmap": true, 00:17:32.204 "flush": true, 00:17:32.204 "reset": true, 00:17:32.204 "nvme_admin": false, 00:17:32.204 "nvme_io": false, 00:17:32.204 "nvme_io_md": false, 00:17:32.204 "write_zeroes": true, 00:17:32.204 "zcopy": false, 00:17:32.204 "get_zone_info": false, 00:17:32.204 "zone_management": false, 00:17:32.204 "zone_append": false, 00:17:32.204 "compare": false, 00:17:32.204 "compare_and_write": false, 00:17:32.204 "abort": false, 00:17:32.204 "seek_hole": false, 00:17:32.204 "seek_data": false, 00:17:32.204 "copy": false, 00:17:32.204 "nvme_iov_md": false 00:17:32.204 }, 00:17:32.204 "memory_domains": [ 00:17:32.204 { 00:17:32.204 "dma_device_id": "system", 00:17:32.204 "dma_device_type": 1 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.204 "dma_device_type": 2 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "system", 00:17:32.204 "dma_device_type": 1 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.204 "dma_device_type": 2 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "system", 00:17:32.204 "dma_device_type": 1 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.204 "dma_device_type": 2 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "system", 00:17:32.204 "dma_device_type": 1 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.204 "dma_device_type": 2 00:17:32.204 } 00:17:32.204 ], 00:17:32.204 "driver_specific": { 00:17:32.204 "raid": { 00:17:32.204 "uuid": "3a7345e5-aae5-4013-a64c-07d40cf20f25", 00:17:32.204 "strip_size_kb": 64, 00:17:32.204 "state": "online", 00:17:32.204 "raid_level": "raid0", 00:17:32.204 "superblock": false, 00:17:32.204 
"num_base_bdevs": 4, 00:17:32.204 "num_base_bdevs_discovered": 4, 00:17:32.204 "num_base_bdevs_operational": 4, 00:17:32.204 "base_bdevs_list": [ 00:17:32.204 { 00:17:32.204 "name": "NewBaseBdev", 00:17:32.204 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:32.204 "is_configured": true, 00:17:32.204 "data_offset": 0, 00:17:32.204 "data_size": 65536 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "name": "BaseBdev2", 00:17:32.204 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:32.204 "is_configured": true, 00:17:32.204 "data_offset": 0, 00:17:32.204 "data_size": 65536 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "name": "BaseBdev3", 00:17:32.204 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:32.204 "is_configured": true, 00:17:32.204 "data_offset": 0, 00:17:32.204 "data_size": 65536 00:17:32.204 }, 00:17:32.204 { 00:17:32.204 "name": "BaseBdev4", 00:17:32.204 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:32.204 "is_configured": true, 00:17:32.204 "data_offset": 0, 00:17:32.204 "data_size": 65536 00:17:32.204 } 00:17:32.204 ] 00:17:32.204 } 00:17:32.204 } 00:17:32.204 }' 00:17:32.204 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:32.462 BaseBdev2 00:17:32.462 BaseBdev3 00:17:32.462 BaseBdev4' 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:17:32.462 "name": "NewBaseBdev", 00:17:32.462 "aliases": [ 00:17:32.462 "5cfd0f54-111a-4365-a050-63a7c0e3b6f2" 00:17:32.462 ], 00:17:32.462 "product_name": "Malloc disk", 00:17:32.462 "block_size": 512, 00:17:32.462 "num_blocks": 65536, 00:17:32.462 "uuid": "5cfd0f54-111a-4365-a050-63a7c0e3b6f2", 00:17:32.462 "assigned_rate_limits": { 00:17:32.462 "rw_ios_per_sec": 0, 00:17:32.462 "rw_mbytes_per_sec": 0, 00:17:32.462 "r_mbytes_per_sec": 0, 00:17:32.462 "w_mbytes_per_sec": 0 00:17:32.462 }, 00:17:32.462 "claimed": true, 00:17:32.462 "claim_type": "exclusive_write", 00:17:32.462 "zoned": false, 00:17:32.462 "supported_io_types": { 00:17:32.462 "read": true, 00:17:32.462 "write": true, 00:17:32.462 "unmap": true, 00:17:32.462 "flush": true, 00:17:32.462 "reset": true, 00:17:32.462 "nvme_admin": false, 00:17:32.462 "nvme_io": false, 00:17:32.462 "nvme_io_md": false, 00:17:32.462 "write_zeroes": true, 00:17:32.462 "zcopy": true, 00:17:32.462 "get_zone_info": false, 00:17:32.462 "zone_management": false, 00:17:32.462 "zone_append": false, 00:17:32.462 "compare": false, 00:17:32.462 "compare_and_write": false, 00:17:32.462 "abort": true, 00:17:32.462 "seek_hole": false, 00:17:32.462 "seek_data": false, 00:17:32.462 "copy": true, 00:17:32.462 "nvme_iov_md": false 00:17:32.462 }, 00:17:32.462 "memory_domains": [ 00:17:32.462 { 00:17:32.462 "dma_device_id": "system", 00:17:32.462 "dma_device_type": 1 00:17:32.462 }, 00:17:32.462 { 00:17:32.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.462 "dma_device_type": 2 00:17:32.462 } 00:17:32.462 ], 00:17:32.462 "driver_specific": {} 00:17:32.462 }' 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.462 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.721 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.721 13:36:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.721 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.721 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.721 13:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.721 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.721 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.721 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.979 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.979 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.979 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.979 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:32.979 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:33.237 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:33.237 "name": "BaseBdev2", 00:17:33.237 "aliases": [ 00:17:33.237 "e1ac618c-ddb1-4367-9852-cb4d41946ccc" 00:17:33.237 ], 00:17:33.237 "product_name": "Malloc disk", 00:17:33.237 "block_size": 512, 00:17:33.237 "num_blocks": 65536, 00:17:33.237 "uuid": "e1ac618c-ddb1-4367-9852-cb4d41946ccc", 00:17:33.238 "assigned_rate_limits": { 00:17:33.238 "rw_ios_per_sec": 0, 00:17:33.238 "rw_mbytes_per_sec": 0, 00:17:33.238 "r_mbytes_per_sec": 0, 00:17:33.238 "w_mbytes_per_sec": 0 00:17:33.238 }, 00:17:33.238 "claimed": true, 00:17:33.238 "claim_type": "exclusive_write", 00:17:33.238 "zoned": false, 
00:17:33.238 "supported_io_types": { 00:17:33.238 "read": true, 00:17:33.238 "write": true, 00:17:33.238 "unmap": true, 00:17:33.238 "flush": true, 00:17:33.238 "reset": true, 00:17:33.238 "nvme_admin": false, 00:17:33.238 "nvme_io": false, 00:17:33.238 "nvme_io_md": false, 00:17:33.238 "write_zeroes": true, 00:17:33.238 "zcopy": true, 00:17:33.238 "get_zone_info": false, 00:17:33.238 "zone_management": false, 00:17:33.238 "zone_append": false, 00:17:33.238 "compare": false, 00:17:33.238 "compare_and_write": false, 00:17:33.238 "abort": true, 00:17:33.238 "seek_hole": false, 00:17:33.238 "seek_data": false, 00:17:33.238 "copy": true, 00:17:33.238 "nvme_iov_md": false 00:17:33.238 }, 00:17:33.238 "memory_domains": [ 00:17:33.238 { 00:17:33.238 "dma_device_id": "system", 00:17:33.238 "dma_device_type": 1 00:17:33.238 }, 00:17:33.238 { 00:17:33.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.238 "dma_device_type": 2 00:17:33.238 } 00:17:33.238 ], 00:17:33.238 "driver_specific": {} 00:17:33.238 }' 00:17:33.238 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.238 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.238 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:33.238 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.496 13:36:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.496 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.754 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.754 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:33.754 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:33.754 13:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:33.754 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:33.754 "name": "BaseBdev3", 00:17:33.754 "aliases": [ 00:17:33.754 "a8df66d8-f3b1-48b6-844b-6678ddb89217" 00:17:33.754 ], 00:17:33.754 "product_name": "Malloc disk", 00:17:33.754 "block_size": 512, 00:17:33.754 "num_blocks": 65536, 00:17:33.754 "uuid": "a8df66d8-f3b1-48b6-844b-6678ddb89217", 00:17:33.754 "assigned_rate_limits": { 00:17:33.754 "rw_ios_per_sec": 0, 00:17:33.754 "rw_mbytes_per_sec": 0, 00:17:33.754 "r_mbytes_per_sec": 0, 00:17:33.754 "w_mbytes_per_sec": 0 00:17:33.754 }, 00:17:33.754 "claimed": true, 00:17:33.754 "claim_type": "exclusive_write", 00:17:33.754 "zoned": false, 00:17:33.754 "supported_io_types": { 00:17:33.754 "read": true, 00:17:33.754 "write": true, 00:17:33.754 "unmap": true, 00:17:33.754 "flush": true, 00:17:33.754 "reset": true, 00:17:33.754 "nvme_admin": false, 00:17:33.754 "nvme_io": false, 00:17:33.754 "nvme_io_md": false, 00:17:33.754 "write_zeroes": true, 00:17:33.754 "zcopy": true, 00:17:33.754 "get_zone_info": false, 00:17:33.754 "zone_management": false, 00:17:33.754 "zone_append": false, 00:17:33.754 "compare": false, 00:17:33.754 "compare_and_write": false, 00:17:33.754 "abort": true, 00:17:33.754 "seek_hole": false, 
00:17:33.754 "seek_data": false, 00:17:33.754 "copy": true, 00:17:33.754 "nvme_iov_md": false 00:17:33.754 }, 00:17:33.754 "memory_domains": [ 00:17:33.754 { 00:17:33.754 "dma_device_id": "system", 00:17:33.754 "dma_device_type": 1 00:17:33.754 }, 00:17:33.754 { 00:17:33.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.754 "dma_device_type": 2 00:17:33.754 } 00:17:33.754 ], 00:17:33.754 "driver_specific": {} 00:17:33.754 }' 00:17:33.754 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:34.011 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:17:34.270 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:34.528 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:34.528 "name": "BaseBdev4", 00:17:34.528 "aliases": [ 00:17:34.528 "54d7e117-3a9c-4acb-9672-e6b507bc02e8" 00:17:34.528 ], 00:17:34.528 "product_name": "Malloc disk", 00:17:34.528 "block_size": 512, 00:17:34.528 "num_blocks": 65536, 00:17:34.528 "uuid": "54d7e117-3a9c-4acb-9672-e6b507bc02e8", 00:17:34.528 "assigned_rate_limits": { 00:17:34.528 "rw_ios_per_sec": 0, 00:17:34.528 "rw_mbytes_per_sec": 0, 00:17:34.528 "r_mbytes_per_sec": 0, 00:17:34.528 "w_mbytes_per_sec": 0 00:17:34.528 }, 00:17:34.528 "claimed": true, 00:17:34.528 "claim_type": "exclusive_write", 00:17:34.528 "zoned": false, 00:17:34.528 "supported_io_types": { 00:17:34.528 "read": true, 00:17:34.528 "write": true, 00:17:34.528 "unmap": true, 00:17:34.528 "flush": true, 00:17:34.528 "reset": true, 00:17:34.528 "nvme_admin": false, 00:17:34.528 "nvme_io": false, 00:17:34.528 "nvme_io_md": false, 00:17:34.528 "write_zeroes": true, 00:17:34.528 "zcopy": true, 00:17:34.528 "get_zone_info": false, 00:17:34.528 "zone_management": false, 00:17:34.528 "zone_append": false, 00:17:34.528 "compare": false, 00:17:34.528 "compare_and_write": false, 00:17:34.528 "abort": true, 00:17:34.528 "seek_hole": false, 00:17:34.528 "seek_data": false, 00:17:34.528 "copy": true, 00:17:34.528 "nvme_iov_md": false 00:17:34.528 }, 00:17:34.528 "memory_domains": [ 00:17:34.528 { 00:17:34.528 "dma_device_id": "system", 00:17:34.528 "dma_device_type": 1 00:17:34.528 }, 00:17:34.528 { 00:17:34.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.528 "dma_device_type": 2 00:17:34.528 } 00:17:34.528 ], 00:17:34.528 "driver_specific": {} 00:17:34.528 }' 00:17:34.528 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.528 13:36:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.528 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:34.528 13:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.785 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:35.042 [2024-07-15 13:36:14.387564] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:35.042 [2024-07-15 13:36:14.387588] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:35.042 [2024-07-15 13:36:14.387637] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:35.042 [2024-07-15 13:36:14.387696] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:35.042 [2024-07-15 13:36:14.387708] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b39040 name Existed_Raid, state offline 00:17:35.042 13:36:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2127802 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2127802 ']' 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2127802 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2127802 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2127802' 00:17:35.042 killing process with pid 2127802 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2127802 00:17:35.042 [2024-07-15 13:36:14.456260] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:35.042 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2127802 00:17:35.299 [2024-07-15 13:36:14.492976] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:35.299 13:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:35.299 00:17:35.299 real 0m33.417s 00:17:35.299 user 1m1.572s 00:17:35.299 sys 0m5.661s 00:17:35.299 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:35.299 13:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.299 ************************************ 00:17:35.299 END TEST raid_state_function_test 
00:17:35.299 ************************************ 00:17:35.557 13:36:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:35.557 13:36:14 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:17:35.557 13:36:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:35.557 13:36:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:35.557 13:36:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:35.557 ************************************ 00:17:35.557 START TEST raid_state_function_test_sb 00:17:35.557 ************************************ 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2133212 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2133212' 00:17:35.557 Process raid pid: 2133212 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2133212 /var/tmp/spdk-raid.sock 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2133212 ']' 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:35.557 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:35.557 13:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:35.557 [2024-07-15 13:36:14.831813] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:17:35.557 [2024-07-15 13:36:14.831875] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:35.557 [2024-07-15 13:36:14.962029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:35.816 [2024-07-15 13:36:15.067149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:35.816 [2024-07-15 13:36:15.128221] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.816 [2024-07-15 13:36:15.128249] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.073 13:36:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:36.073 13:36:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:36.073 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:36.331 [2024-07-15 13:36:15.521223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:36.331 [2024-07-15 13:36:15.521264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:36.331 [2024-07-15 13:36:15.521275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:36.331 [2024-07-15 13:36:15.521287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:36.331 [2024-07-15 13:36:15.521295] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:36.331 [2024-07-15 13:36:15.521307] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:36.331 [2024-07-15 13:36:15.521316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:36.331 [2024-07-15 13:36:15.521327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.331 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.588 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.588 "name": "Existed_Raid", 00:17:36.588 "uuid": "28f110eb-bf96-4e93-b1c8-4d244df8530f", 
00:17:36.588 "strip_size_kb": 64, 00:17:36.588 "state": "configuring", 00:17:36.588 "raid_level": "raid0", 00:17:36.588 "superblock": true, 00:17:36.588 "num_base_bdevs": 4, 00:17:36.588 "num_base_bdevs_discovered": 0, 00:17:36.588 "num_base_bdevs_operational": 4, 00:17:36.588 "base_bdevs_list": [ 00:17:36.588 { 00:17:36.588 "name": "BaseBdev1", 00:17:36.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.588 "is_configured": false, 00:17:36.588 "data_offset": 0, 00:17:36.588 "data_size": 0 00:17:36.588 }, 00:17:36.588 { 00:17:36.588 "name": "BaseBdev2", 00:17:36.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.588 "is_configured": false, 00:17:36.588 "data_offset": 0, 00:17:36.588 "data_size": 0 00:17:36.588 }, 00:17:36.588 { 00:17:36.588 "name": "BaseBdev3", 00:17:36.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.588 "is_configured": false, 00:17:36.588 "data_offset": 0, 00:17:36.588 "data_size": 0 00:17:36.588 }, 00:17:36.588 { 00:17:36.588 "name": "BaseBdev4", 00:17:36.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.588 "is_configured": false, 00:17:36.588 "data_offset": 0, 00:17:36.588 "data_size": 0 00:17:36.588 } 00:17:36.588 ] 00:17:36.588 }' 00:17:36.588 13:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.588 13:36:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:37.153 13:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:37.412 [2024-07-15 13:36:16.579879] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:37.412 [2024-07-15 13:36:16.579908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9cdaa0 name Existed_Raid, state configuring 00:17:37.412 13:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:37.412 [2024-07-15 13:36:16.756392] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.412 [2024-07-15 13:36:16.756420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.412 [2024-07-15 13:36:16.756430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:37.412 [2024-07-15 13:36:16.756441] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:37.412 [2024-07-15 13:36:16.756450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:37.412 [2024-07-15 13:36:16.756462] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:37.412 [2024-07-15 13:36:16.756471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:37.412 [2024-07-15 13:36:16.756482] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:37.412 13:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:37.670 [2024-07-15 13:36:16.930684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:37.670 BaseBdev1 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.670 13:36:16 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.670 13:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:37.929 [ 00:17:37.929 { 00:17:37.929 "name": "BaseBdev1", 00:17:37.929 "aliases": [ 00:17:37.929 "16297227-e3ec-4b34-84cf-758055d51484" 00:17:37.929 ], 00:17:37.929 "product_name": "Malloc disk", 00:17:37.929 "block_size": 512, 00:17:37.929 "num_blocks": 65536, 00:17:37.929 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:37.929 "assigned_rate_limits": { 00:17:37.929 "rw_ios_per_sec": 0, 00:17:37.929 "rw_mbytes_per_sec": 0, 00:17:37.929 "r_mbytes_per_sec": 0, 00:17:37.929 "w_mbytes_per_sec": 0 00:17:37.929 }, 00:17:37.929 "claimed": true, 00:17:37.929 "claim_type": "exclusive_write", 00:17:37.929 "zoned": false, 00:17:37.929 "supported_io_types": { 00:17:37.929 "read": true, 00:17:37.929 "write": true, 00:17:37.929 "unmap": true, 00:17:37.929 "flush": true, 00:17:37.929 "reset": true, 00:17:37.929 "nvme_admin": false, 00:17:37.929 "nvme_io": false, 00:17:37.929 "nvme_io_md": false, 00:17:37.929 "write_zeroes": true, 00:17:37.929 "zcopy": true, 00:17:37.929 "get_zone_info": false, 00:17:37.929 "zone_management": false, 00:17:37.929 "zone_append": false, 00:17:37.929 "compare": false, 00:17:37.929 "compare_and_write": false, 00:17:37.929 "abort": true, 00:17:37.929 "seek_hole": false, 00:17:37.929 "seek_data": false, 
00:17:37.929 "copy": true, 00:17:37.929 "nvme_iov_md": false 00:17:37.929 }, 00:17:37.929 "memory_domains": [ 00:17:37.929 { 00:17:37.929 "dma_device_id": "system", 00:17:37.929 "dma_device_type": 1 00:17:37.929 }, 00:17:37.929 { 00:17:37.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.929 "dma_device_type": 2 00:17:37.929 } 00:17:37.929 ], 00:17:37.929 "driver_specific": {} 00:17:37.929 } 00:17:37.929 ] 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.929 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.188 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.188 "name": "Existed_Raid", 00:17:38.188 "uuid": "00087897-9133-4092-936a-7eaef1a9c29f", 00:17:38.188 "strip_size_kb": 64, 00:17:38.188 "state": "configuring", 00:17:38.188 "raid_level": "raid0", 00:17:38.188 "superblock": true, 00:17:38.188 "num_base_bdevs": 4, 00:17:38.188 "num_base_bdevs_discovered": 1, 00:17:38.188 "num_base_bdevs_operational": 4, 00:17:38.188 "base_bdevs_list": [ 00:17:38.188 { 00:17:38.188 "name": "BaseBdev1", 00:17:38.188 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:38.188 "is_configured": true, 00:17:38.188 "data_offset": 2048, 00:17:38.188 "data_size": 63488 00:17:38.188 }, 00:17:38.188 { 00:17:38.188 "name": "BaseBdev2", 00:17:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.188 "is_configured": false, 00:17:38.188 "data_offset": 0, 00:17:38.188 "data_size": 0 00:17:38.188 }, 00:17:38.188 { 00:17:38.188 "name": "BaseBdev3", 00:17:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.188 "is_configured": false, 00:17:38.188 "data_offset": 0, 00:17:38.188 "data_size": 0 00:17:38.188 }, 00:17:38.188 { 00:17:38.188 "name": "BaseBdev4", 00:17:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.188 "is_configured": false, 00:17:38.188 "data_offset": 0, 00:17:38.188 "data_size": 0 00:17:38.188 } 00:17:38.188 ] 00:17:38.188 }' 00:17:38.188 13:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.188 13:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:38.755 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:39.013 [2024-07-15 13:36:18.358461] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:17:39.013 [2024-07-15 13:36:18.358498] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9cd310 name Existed_Raid, state configuring 00:17:39.013 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:39.271 [2024-07-15 13:36:18.534979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.271 [2024-07-15 13:36:18.536400] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:39.271 [2024-07-15 13:36:18.536430] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:39.271 [2024-07-15 13:36:18.536440] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:39.271 [2024-07-15 13:36:18.536452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:39.271 [2024-07-15 13:36:18.536460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:39.271 [2024-07-15 13:36:18.536472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.271 13:36:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.271 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.530 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.530 "name": "Existed_Raid", 00:17:39.530 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:39.530 "strip_size_kb": 64, 00:17:39.530 "state": "configuring", 00:17:39.530 "raid_level": "raid0", 00:17:39.530 "superblock": true, 00:17:39.530 "num_base_bdevs": 4, 00:17:39.530 "num_base_bdevs_discovered": 1, 00:17:39.530 "num_base_bdevs_operational": 4, 00:17:39.530 "base_bdevs_list": [ 00:17:39.530 { 00:17:39.530 "name": "BaseBdev1", 00:17:39.530 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:39.530 "is_configured": true, 00:17:39.530 "data_offset": 2048, 00:17:39.530 "data_size": 63488 00:17:39.530 }, 00:17:39.530 { 00:17:39.530 "name": "BaseBdev2", 00:17:39.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.530 "is_configured": false, 
00:17:39.530 "data_offset": 0, 00:17:39.530 "data_size": 0 00:17:39.530 }, 00:17:39.530 { 00:17:39.530 "name": "BaseBdev3", 00:17:39.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.530 "is_configured": false, 00:17:39.530 "data_offset": 0, 00:17:39.530 "data_size": 0 00:17:39.530 }, 00:17:39.530 { 00:17:39.530 "name": "BaseBdev4", 00:17:39.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.530 "is_configured": false, 00:17:39.530 "data_offset": 0, 00:17:39.530 "data_size": 0 00:17:39.530 } 00:17:39.530 ] 00:17:39.530 }' 00:17:39.530 13:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.530 13:36:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:40.097 13:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:40.355 [2024-07-15 13:36:19.641234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:40.355 BaseBdev2 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:40.355 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:40.613 13:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:40.871 [ 00:17:40.871 { 00:17:40.871 "name": "BaseBdev2", 00:17:40.871 "aliases": [ 00:17:40.871 "6d29d505-efe9-4b47-917c-91158daeb2ac" 00:17:40.871 ], 00:17:40.871 "product_name": "Malloc disk", 00:17:40.871 "block_size": 512, 00:17:40.871 "num_blocks": 65536, 00:17:40.871 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:40.871 "assigned_rate_limits": { 00:17:40.871 "rw_ios_per_sec": 0, 00:17:40.871 "rw_mbytes_per_sec": 0, 00:17:40.871 "r_mbytes_per_sec": 0, 00:17:40.871 "w_mbytes_per_sec": 0 00:17:40.871 }, 00:17:40.871 "claimed": true, 00:17:40.871 "claim_type": "exclusive_write", 00:17:40.871 "zoned": false, 00:17:40.871 "supported_io_types": { 00:17:40.871 "read": true, 00:17:40.871 "write": true, 00:17:40.871 "unmap": true, 00:17:40.871 "flush": true, 00:17:40.871 "reset": true, 00:17:40.871 "nvme_admin": false, 00:17:40.871 "nvme_io": false, 00:17:40.871 "nvme_io_md": false, 00:17:40.871 "write_zeroes": true, 00:17:40.871 "zcopy": true, 00:17:40.871 "get_zone_info": false, 00:17:40.871 "zone_management": false, 00:17:40.871 "zone_append": false, 00:17:40.871 "compare": false, 00:17:40.871 "compare_and_write": false, 00:17:40.871 "abort": true, 00:17:40.871 "seek_hole": false, 00:17:40.871 "seek_data": false, 00:17:40.871 "copy": true, 00:17:40.871 "nvme_iov_md": false 00:17:40.871 }, 00:17:40.871 "memory_domains": [ 00:17:40.871 { 00:17:40.871 "dma_device_id": "system", 00:17:40.871 "dma_device_type": 1 00:17:40.871 }, 00:17:40.871 { 00:17:40.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.871 "dma_device_type": 2 00:17:40.871 } 00:17:40.871 ], 00:17:40.871 "driver_specific": {} 00:17:40.871 } 00:17:40.871 ] 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.871 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.129 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.129 "name": "Existed_Raid", 00:17:41.129 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:41.129 "strip_size_kb": 64, 
00:17:41.129 "state": "configuring", 00:17:41.129 "raid_level": "raid0", 00:17:41.129 "superblock": true, 00:17:41.129 "num_base_bdevs": 4, 00:17:41.129 "num_base_bdevs_discovered": 2, 00:17:41.129 "num_base_bdevs_operational": 4, 00:17:41.129 "base_bdevs_list": [ 00:17:41.129 { 00:17:41.129 "name": "BaseBdev1", 00:17:41.129 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:41.129 "is_configured": true, 00:17:41.129 "data_offset": 2048, 00:17:41.129 "data_size": 63488 00:17:41.129 }, 00:17:41.129 { 00:17:41.129 "name": "BaseBdev2", 00:17:41.129 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:41.129 "is_configured": true, 00:17:41.129 "data_offset": 2048, 00:17:41.129 "data_size": 63488 00:17:41.129 }, 00:17:41.129 { 00:17:41.129 "name": "BaseBdev3", 00:17:41.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.129 "is_configured": false, 00:17:41.129 "data_offset": 0, 00:17:41.129 "data_size": 0 00:17:41.129 }, 00:17:41.129 { 00:17:41.129 "name": "BaseBdev4", 00:17:41.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.129 "is_configured": false, 00:17:41.129 "data_offset": 0, 00:17:41.129 "data_size": 0 00:17:41.129 } 00:17:41.129 ] 00:17:41.129 }' 00:17:41.129 13:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.129 13:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.694 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:41.952 [2024-07-15 13:36:21.236971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.952 BaseBdev3 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:41.952 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:42.210 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:42.468 [ 00:17:42.468 { 00:17:42.468 "name": "BaseBdev3", 00:17:42.468 "aliases": [ 00:17:42.468 "aa4d7442-0e97-4706-8a8f-9d0ecdb41276" 00:17:42.468 ], 00:17:42.468 "product_name": "Malloc disk", 00:17:42.468 "block_size": 512, 00:17:42.468 "num_blocks": 65536, 00:17:42.468 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:42.468 "assigned_rate_limits": { 00:17:42.468 "rw_ios_per_sec": 0, 00:17:42.468 "rw_mbytes_per_sec": 0, 00:17:42.468 "r_mbytes_per_sec": 0, 00:17:42.468 "w_mbytes_per_sec": 0 00:17:42.468 }, 00:17:42.468 "claimed": true, 00:17:42.468 "claim_type": "exclusive_write", 00:17:42.468 "zoned": false, 00:17:42.468 "supported_io_types": { 00:17:42.468 "read": true, 00:17:42.468 "write": true, 00:17:42.468 "unmap": true, 00:17:42.468 "flush": true, 00:17:42.468 "reset": true, 00:17:42.468 "nvme_admin": false, 00:17:42.468 "nvme_io": false, 00:17:42.468 "nvme_io_md": false, 00:17:42.468 "write_zeroes": true, 00:17:42.468 "zcopy": true, 00:17:42.468 "get_zone_info": false, 00:17:42.468 "zone_management": false, 00:17:42.468 "zone_append": false, 00:17:42.468 
"compare": false, 00:17:42.468 "compare_and_write": false, 00:17:42.468 "abort": true, 00:17:42.468 "seek_hole": false, 00:17:42.468 "seek_data": false, 00:17:42.468 "copy": true, 00:17:42.468 "nvme_iov_md": false 00:17:42.468 }, 00:17:42.468 "memory_domains": [ 00:17:42.468 { 00:17:42.468 "dma_device_id": "system", 00:17:42.468 "dma_device_type": 1 00:17:42.468 }, 00:17:42.468 { 00:17:42.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.468 "dma_device_type": 2 00:17:42.468 } 00:17:42.468 ], 00:17:42.468 "driver_specific": {} 00:17:42.468 } 00:17:42.468 ] 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.468 13:36:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.468 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.726 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.726 "name": "Existed_Raid", 00:17:42.726 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:42.726 "strip_size_kb": 64, 00:17:42.726 "state": "configuring", 00:17:42.726 "raid_level": "raid0", 00:17:42.726 "superblock": true, 00:17:42.726 "num_base_bdevs": 4, 00:17:42.726 "num_base_bdevs_discovered": 3, 00:17:42.726 "num_base_bdevs_operational": 4, 00:17:42.726 "base_bdevs_list": [ 00:17:42.726 { 00:17:42.726 "name": "BaseBdev1", 00:17:42.726 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:42.726 "is_configured": true, 00:17:42.726 "data_offset": 2048, 00:17:42.726 "data_size": 63488 00:17:42.726 }, 00:17:42.726 { 00:17:42.726 "name": "BaseBdev2", 00:17:42.726 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:42.726 "is_configured": true, 00:17:42.726 "data_offset": 2048, 00:17:42.726 "data_size": 63488 00:17:42.726 }, 00:17:42.726 { 00:17:42.726 "name": "BaseBdev3", 00:17:42.726 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:42.726 "is_configured": true, 00:17:42.726 "data_offset": 2048, 00:17:42.726 "data_size": 63488 00:17:42.726 }, 00:17:42.726 { 00:17:42.726 "name": "BaseBdev4", 00:17:42.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.726 "is_configured": false, 00:17:42.726 "data_offset": 0, 00:17:42.726 "data_size": 0 00:17:42.726 } 00:17:42.726 ] 00:17:42.726 }' 00:17:42.726 13:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.726 13:36:21 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:43.300 13:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:43.558 [2024-07-15 13:36:22.772369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:43.558 [2024-07-15 13:36:22.772537] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9ce350 00:17:43.558 [2024-07-15 13:36:22.772551] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:43.558 [2024-07-15 13:36:22.772720] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ce020 00:17:43.558 [2024-07-15 13:36:22.772834] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9ce350 00:17:43.558 [2024-07-15 13:36:22.772844] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9ce350 00:17:43.558 [2024-07-15 13:36:22.772952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.558 BaseBdev4 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.558 13:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.816 13:36:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:44.072 [ 00:17:44.072 { 00:17:44.072 "name": "BaseBdev4", 00:17:44.072 "aliases": [ 00:17:44.072 "0735233d-6d26-489a-9ea6-f25f62b2d296" 00:17:44.072 ], 00:17:44.072 "product_name": "Malloc disk", 00:17:44.072 "block_size": 512, 00:17:44.072 "num_blocks": 65536, 00:17:44.072 "uuid": "0735233d-6d26-489a-9ea6-f25f62b2d296", 00:17:44.072 "assigned_rate_limits": { 00:17:44.072 "rw_ios_per_sec": 0, 00:17:44.072 "rw_mbytes_per_sec": 0, 00:17:44.072 "r_mbytes_per_sec": 0, 00:17:44.072 "w_mbytes_per_sec": 0 00:17:44.072 }, 00:17:44.072 "claimed": true, 00:17:44.072 "claim_type": "exclusive_write", 00:17:44.072 "zoned": false, 00:17:44.072 "supported_io_types": { 00:17:44.072 "read": true, 00:17:44.072 "write": true, 00:17:44.072 "unmap": true, 00:17:44.072 "flush": true, 00:17:44.072 "reset": true, 00:17:44.072 "nvme_admin": false, 00:17:44.072 "nvme_io": false, 00:17:44.072 "nvme_io_md": false, 00:17:44.072 "write_zeroes": true, 00:17:44.072 "zcopy": true, 00:17:44.072 "get_zone_info": false, 00:17:44.072 "zone_management": false, 00:17:44.072 "zone_append": false, 00:17:44.072 "compare": false, 00:17:44.072 "compare_and_write": false, 00:17:44.072 "abort": true, 00:17:44.072 "seek_hole": false, 00:17:44.072 "seek_data": false, 00:17:44.072 "copy": true, 00:17:44.072 "nvme_iov_md": false 00:17:44.072 }, 00:17:44.072 "memory_domains": [ 00:17:44.072 { 00:17:44.072 "dma_device_id": "system", 00:17:44.072 "dma_device_type": 1 00:17:44.072 }, 00:17:44.072 { 00:17:44.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.072 "dma_device_type": 2 00:17:44.072 } 00:17:44.072 ], 00:17:44.072 "driver_specific": {} 00:17:44.072 } 00:17:44.072 ] 
00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.072 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.328 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.328 "name": "Existed_Raid", 00:17:44.328 
"uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:44.328 "strip_size_kb": 64, 00:17:44.328 "state": "online", 00:17:44.328 "raid_level": "raid0", 00:17:44.328 "superblock": true, 00:17:44.328 "num_base_bdevs": 4, 00:17:44.328 "num_base_bdevs_discovered": 4, 00:17:44.328 "num_base_bdevs_operational": 4, 00:17:44.328 "base_bdevs_list": [ 00:17:44.328 { 00:17:44.328 "name": "BaseBdev1", 00:17:44.328 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:44.328 "is_configured": true, 00:17:44.328 "data_offset": 2048, 00:17:44.328 "data_size": 63488 00:17:44.328 }, 00:17:44.328 { 00:17:44.328 "name": "BaseBdev2", 00:17:44.328 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:44.328 "is_configured": true, 00:17:44.328 "data_offset": 2048, 00:17:44.328 "data_size": 63488 00:17:44.328 }, 00:17:44.328 { 00:17:44.328 "name": "BaseBdev3", 00:17:44.328 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:44.328 "is_configured": true, 00:17:44.328 "data_offset": 2048, 00:17:44.328 "data_size": 63488 00:17:44.328 }, 00:17:44.328 { 00:17:44.328 "name": "BaseBdev4", 00:17:44.329 "uuid": "0735233d-6d26-489a-9ea6-f25f62b2d296", 00:17:44.329 "is_configured": true, 00:17:44.329 "data_offset": 2048, 00:17:44.329 "data_size": 63488 00:17:44.329 } 00:17:44.329 ] 00:17:44.329 }' 00:17:44.329 13:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.329 13:36:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:44.902 13:36:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:44.902 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:44.902 [2024-07-15 13:36:24.320823] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:45.176 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:45.176 "name": "Existed_Raid", 00:17:45.176 "aliases": [ 00:17:45.176 "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c" 00:17:45.176 ], 00:17:45.176 "product_name": "Raid Volume", 00:17:45.176 "block_size": 512, 00:17:45.176 "num_blocks": 253952, 00:17:45.176 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:45.176 "assigned_rate_limits": { 00:17:45.176 "rw_ios_per_sec": 0, 00:17:45.176 "rw_mbytes_per_sec": 0, 00:17:45.176 "r_mbytes_per_sec": 0, 00:17:45.176 "w_mbytes_per_sec": 0 00:17:45.176 }, 00:17:45.176 "claimed": false, 00:17:45.176 "zoned": false, 00:17:45.176 "supported_io_types": { 00:17:45.176 "read": true, 00:17:45.176 "write": true, 00:17:45.176 "unmap": true, 00:17:45.176 "flush": true, 00:17:45.176 "reset": true, 00:17:45.176 "nvme_admin": false, 00:17:45.176 "nvme_io": false, 00:17:45.176 "nvme_io_md": false, 00:17:45.176 "write_zeroes": true, 00:17:45.176 "zcopy": false, 00:17:45.176 "get_zone_info": false, 00:17:45.176 "zone_management": false, 00:17:45.176 "zone_append": false, 00:17:45.176 "compare": false, 00:17:45.176 "compare_and_write": false, 00:17:45.176 "abort": false, 00:17:45.176 "seek_hole": false, 00:17:45.176 "seek_data": false, 00:17:45.176 "copy": false, 00:17:45.176 "nvme_iov_md": false 00:17:45.176 }, 00:17:45.176 
"memory_domains": [ 00:17:45.176 { 00:17:45.176 "dma_device_id": "system", 00:17:45.176 "dma_device_type": 1 00:17:45.176 }, 00:17:45.176 { 00:17:45.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.176 "dma_device_type": 2 00:17:45.176 }, 00:17:45.176 { 00:17:45.176 "dma_device_id": "system", 00:17:45.176 "dma_device_type": 1 00:17:45.176 }, 00:17:45.176 { 00:17:45.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.176 "dma_device_type": 2 00:17:45.176 }, 00:17:45.176 { 00:17:45.176 "dma_device_id": "system", 00:17:45.176 "dma_device_type": 1 00:17:45.176 }, 00:17:45.176 { 00:17:45.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.176 "dma_device_type": 2 00:17:45.176 }, 00:17:45.177 { 00:17:45.177 "dma_device_id": "system", 00:17:45.177 "dma_device_type": 1 00:17:45.177 }, 00:17:45.177 { 00:17:45.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.177 "dma_device_type": 2 00:17:45.177 } 00:17:45.177 ], 00:17:45.177 "driver_specific": { 00:17:45.177 "raid": { 00:17:45.177 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:45.177 "strip_size_kb": 64, 00:17:45.177 "state": "online", 00:17:45.177 "raid_level": "raid0", 00:17:45.177 "superblock": true, 00:17:45.177 "num_base_bdevs": 4, 00:17:45.177 "num_base_bdevs_discovered": 4, 00:17:45.177 "num_base_bdevs_operational": 4, 00:17:45.177 "base_bdevs_list": [ 00:17:45.177 { 00:17:45.177 "name": "BaseBdev1", 00:17:45.177 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:45.177 "is_configured": true, 00:17:45.177 "data_offset": 2048, 00:17:45.177 "data_size": 63488 00:17:45.177 }, 00:17:45.177 { 00:17:45.177 "name": "BaseBdev2", 00:17:45.177 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:45.177 "is_configured": true, 00:17:45.177 "data_offset": 2048, 00:17:45.177 "data_size": 63488 00:17:45.177 }, 00:17:45.177 { 00:17:45.177 "name": "BaseBdev3", 00:17:45.177 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:45.177 "is_configured": true, 00:17:45.177 "data_offset": 2048, 00:17:45.177 
"data_size": 63488 00:17:45.177 }, 00:17:45.177 { 00:17:45.177 "name": "BaseBdev4", 00:17:45.177 "uuid": "0735233d-6d26-489a-9ea6-f25f62b2d296", 00:17:45.177 "is_configured": true, 00:17:45.177 "data_offset": 2048, 00:17:45.177 "data_size": 63488 00:17:45.177 } 00:17:45.177 ] 00:17:45.177 } 00:17:45.177 } 00:17:45.177 }' 00:17:45.177 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:45.177 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:45.177 BaseBdev2 00:17:45.177 BaseBdev3 00:17:45.177 BaseBdev4' 00:17:45.177 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.177 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.177 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:45.741 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.741 "name": "BaseBdev1", 00:17:45.741 "aliases": [ 00:17:45.741 "16297227-e3ec-4b34-84cf-758055d51484" 00:17:45.741 ], 00:17:45.741 "product_name": "Malloc disk", 00:17:45.741 "block_size": 512, 00:17:45.741 "num_blocks": 65536, 00:17:45.741 "uuid": "16297227-e3ec-4b34-84cf-758055d51484", 00:17:45.741 "assigned_rate_limits": { 00:17:45.741 "rw_ios_per_sec": 0, 00:17:45.741 "rw_mbytes_per_sec": 0, 00:17:45.741 "r_mbytes_per_sec": 0, 00:17:45.741 "w_mbytes_per_sec": 0 00:17:45.741 }, 00:17:45.741 "claimed": true, 00:17:45.741 "claim_type": "exclusive_write", 00:17:45.741 "zoned": false, 00:17:45.741 "supported_io_types": { 00:17:45.741 "read": true, 00:17:45.741 "write": true, 00:17:45.741 "unmap": true, 00:17:45.741 "flush": true, 00:17:45.741 "reset": true, 
00:17:45.741 "nvme_admin": false, 00:17:45.741 "nvme_io": false, 00:17:45.741 "nvme_io_md": false, 00:17:45.741 "write_zeroes": true, 00:17:45.741 "zcopy": true, 00:17:45.741 "get_zone_info": false, 00:17:45.741 "zone_management": false, 00:17:45.741 "zone_append": false, 00:17:45.741 "compare": false, 00:17:45.741 "compare_and_write": false, 00:17:45.741 "abort": true, 00:17:45.741 "seek_hole": false, 00:17:45.741 "seek_data": false, 00:17:45.741 "copy": true, 00:17:45.741 "nvme_iov_md": false 00:17:45.741 }, 00:17:45.741 "memory_domains": [ 00:17:45.741 { 00:17:45.741 "dma_device_id": "system", 00:17:45.741 "dma_device_type": 1 00:17:45.741 }, 00:17:45.741 { 00:17:45.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.741 "dma_device_type": 2 00:17:45.741 } 00:17:45.741 ], 00:17:45.741 "driver_specific": {} 00:17:45.741 }' 00:17:45.741 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.741 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.741 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.741 13:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.741 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.741 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.741 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.741 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:45.999 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.258 "name": "BaseBdev2", 00:17:46.258 "aliases": [ 00:17:46.258 "6d29d505-efe9-4b47-917c-91158daeb2ac" 00:17:46.258 ], 00:17:46.258 "product_name": "Malloc disk", 00:17:46.258 "block_size": 512, 00:17:46.258 "num_blocks": 65536, 00:17:46.258 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:46.258 "assigned_rate_limits": { 00:17:46.258 "rw_ios_per_sec": 0, 00:17:46.258 "rw_mbytes_per_sec": 0, 00:17:46.258 "r_mbytes_per_sec": 0, 00:17:46.258 "w_mbytes_per_sec": 0 00:17:46.258 }, 00:17:46.258 "claimed": true, 00:17:46.258 "claim_type": "exclusive_write", 00:17:46.258 "zoned": false, 00:17:46.258 "supported_io_types": { 00:17:46.258 "read": true, 00:17:46.258 "write": true, 00:17:46.258 "unmap": true, 00:17:46.258 "flush": true, 00:17:46.258 "reset": true, 00:17:46.258 "nvme_admin": false, 00:17:46.258 "nvme_io": false, 00:17:46.258 "nvme_io_md": false, 00:17:46.258 "write_zeroes": true, 00:17:46.258 "zcopy": true, 00:17:46.258 "get_zone_info": false, 00:17:46.258 "zone_management": false, 00:17:46.258 "zone_append": false, 00:17:46.258 "compare": false, 00:17:46.258 "compare_and_write": false, 00:17:46.258 "abort": true, 00:17:46.258 "seek_hole": false, 00:17:46.258 "seek_data": false, 00:17:46.258 "copy": true, 00:17:46.258 "nvme_iov_md": false 00:17:46.258 }, 00:17:46.258 "memory_domains": [ 00:17:46.258 { 
00:17:46.258 "dma_device_id": "system", 00:17:46.258 "dma_device_type": 1 00:17:46.258 }, 00:17:46.258 { 00:17:46.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.258 "dma_device_type": 2 00:17:46.258 } 00:17:46.258 ], 00:17:46.258 "driver_specific": {} 00:17:46.258 }' 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.258 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:46.516 13:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.772 13:36:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.772 "name": "BaseBdev3", 00:17:46.772 "aliases": [ 00:17:46.772 "aa4d7442-0e97-4706-8a8f-9d0ecdb41276" 00:17:46.772 ], 00:17:46.772 "product_name": "Malloc disk", 00:17:46.772 "block_size": 512, 00:17:46.772 "num_blocks": 65536, 00:17:46.772 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:46.772 "assigned_rate_limits": { 00:17:46.772 "rw_ios_per_sec": 0, 00:17:46.772 "rw_mbytes_per_sec": 0, 00:17:46.772 "r_mbytes_per_sec": 0, 00:17:46.772 "w_mbytes_per_sec": 0 00:17:46.772 }, 00:17:46.772 "claimed": true, 00:17:46.772 "claim_type": "exclusive_write", 00:17:46.772 "zoned": false, 00:17:46.772 "supported_io_types": { 00:17:46.772 "read": true, 00:17:46.772 "write": true, 00:17:46.772 "unmap": true, 00:17:46.772 "flush": true, 00:17:46.772 "reset": true, 00:17:46.772 "nvme_admin": false, 00:17:46.772 "nvme_io": false, 00:17:46.772 "nvme_io_md": false, 00:17:46.772 "write_zeroes": true, 00:17:46.772 "zcopy": true, 00:17:46.772 "get_zone_info": false, 00:17:46.772 "zone_management": false, 00:17:46.772 "zone_append": false, 00:17:46.772 "compare": false, 00:17:46.772 "compare_and_write": false, 00:17:46.772 "abort": true, 00:17:46.772 "seek_hole": false, 00:17:46.772 "seek_data": false, 00:17:46.772 "copy": true, 00:17:46.772 "nvme_iov_md": false 00:17:46.772 }, 00:17:46.772 "memory_domains": [ 00:17:46.772 { 00:17:46.772 "dma_device_id": "system", 00:17:46.772 "dma_device_type": 1 00:17:46.772 }, 00:17:46.772 { 00:17:46.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.772 "dma_device_type": 2 00:17:46.772 } 00:17:46.772 ], 00:17:46.772 "driver_specific": {} 00:17:46.772 }' 00:17:46.772 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.772 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.772 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:17:46.772 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:47.028 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.285 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.285 "name": "BaseBdev4", 00:17:47.285 "aliases": [ 00:17:47.285 "0735233d-6d26-489a-9ea6-f25f62b2d296" 00:17:47.285 ], 00:17:47.285 "product_name": "Malloc disk", 00:17:47.285 "block_size": 512, 00:17:47.285 "num_blocks": 65536, 00:17:47.285 "uuid": "0735233d-6d26-489a-9ea6-f25f62b2d296", 00:17:47.285 "assigned_rate_limits": { 00:17:47.285 "rw_ios_per_sec": 0, 00:17:47.285 "rw_mbytes_per_sec": 0, 00:17:47.285 "r_mbytes_per_sec": 0, 00:17:47.285 "w_mbytes_per_sec": 0 
00:17:47.285 }, 00:17:47.285 "claimed": true, 00:17:47.286 "claim_type": "exclusive_write", 00:17:47.286 "zoned": false, 00:17:47.286 "supported_io_types": { 00:17:47.286 "read": true, 00:17:47.286 "write": true, 00:17:47.286 "unmap": true, 00:17:47.286 "flush": true, 00:17:47.286 "reset": true, 00:17:47.286 "nvme_admin": false, 00:17:47.286 "nvme_io": false, 00:17:47.286 "nvme_io_md": false, 00:17:47.286 "write_zeroes": true, 00:17:47.286 "zcopy": true, 00:17:47.286 "get_zone_info": false, 00:17:47.286 "zone_management": false, 00:17:47.286 "zone_append": false, 00:17:47.286 "compare": false, 00:17:47.286 "compare_and_write": false, 00:17:47.286 "abort": true, 00:17:47.286 "seek_hole": false, 00:17:47.286 "seek_data": false, 00:17:47.286 "copy": true, 00:17:47.286 "nvme_iov_md": false 00:17:47.286 }, 00:17:47.286 "memory_domains": [ 00:17:47.286 { 00:17:47.286 "dma_device_id": "system", 00:17:47.286 "dma_device_type": 1 00:17:47.286 }, 00:17:47.286 { 00:17:47.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.286 "dma_device_type": 2 00:17:47.286 } 00:17:47.286 ], 00:17:47.286 "driver_specific": {} 00:17:47.286 }' 00:17:47.286 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.542 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.799 
13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.799 13:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.799 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.799 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.799 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:48.056 [2024-07-15 13:36:27.296471] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:48.056 [2024-07-15 13:36:27.296497] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:48.056 [2024-07-15 13:36:27.296544] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:48.056 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.057 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:48.314 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.314 "name": "Existed_Raid", 00:17:48.314 "uuid": "ac1fc4ad-e070-48b7-a2df-2daf3de8a30c", 00:17:48.314 "strip_size_kb": 64, 00:17:48.314 "state": "offline", 00:17:48.314 "raid_level": "raid0", 00:17:48.314 "superblock": true, 00:17:48.314 "num_base_bdevs": 4, 00:17:48.314 "num_base_bdevs_discovered": 3, 00:17:48.314 "num_base_bdevs_operational": 3, 00:17:48.314 "base_bdevs_list": [ 00:17:48.314 { 00:17:48.314 "name": null, 00:17:48.314 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:48.314 "is_configured": false, 00:17:48.314 "data_offset": 2048, 00:17:48.314 "data_size": 63488 00:17:48.314 }, 00:17:48.314 { 00:17:48.314 "name": "BaseBdev2", 00:17:48.314 "uuid": "6d29d505-efe9-4b47-917c-91158daeb2ac", 00:17:48.314 "is_configured": true, 00:17:48.314 "data_offset": 2048, 00:17:48.314 "data_size": 63488 00:17:48.314 }, 00:17:48.314 
{ 00:17:48.314 "name": "BaseBdev3", 00:17:48.314 "uuid": "aa4d7442-0e97-4706-8a8f-9d0ecdb41276", 00:17:48.314 "is_configured": true, 00:17:48.314 "data_offset": 2048, 00:17:48.314 "data_size": 63488 00:17:48.314 }, 00:17:48.314 { 00:17:48.314 "name": "BaseBdev4", 00:17:48.314 "uuid": "0735233d-6d26-489a-9ea6-f25f62b2d296", 00:17:48.314 "is_configured": true, 00:17:48.314 "data_offset": 2048, 00:17:48.314 "data_size": 63488 00:17:48.314 } 00:17:48.314 ] 00:17:48.314 }' 00:17:48.314 13:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.314 13:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.246 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:49.246 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:49.246 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.246 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:49.503 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:49.503 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:49.503 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:49.503 [2024-07-15 13:36:28.897799] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:49.503 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:49.503 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:49.761 
13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:49.761 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.761 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:49.761 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:49.761 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:50.018 [2024-07-15 13:36:29.329515] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:50.018 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:50.018 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:50.018 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.018 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:50.276 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:50.276 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:50.276 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:50.534 [2024-07-15 13:36:29.829387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:50.534 [2024-07-15 13:36:29.829429] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9ce350 name Existed_Raid, state offline 00:17:50.534 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:50.534 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:50.534 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.534 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:50.792 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:51.050 BaseBdev2 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:51.050 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:51.308 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:51.565 [ 00:17:51.565 { 00:17:51.565 "name": "BaseBdev2", 00:17:51.565 "aliases": [ 00:17:51.565 "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c" 00:17:51.565 ], 00:17:51.565 "product_name": "Malloc disk", 00:17:51.565 "block_size": 512, 00:17:51.565 "num_blocks": 65536, 00:17:51.565 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:51.565 "assigned_rate_limits": { 00:17:51.565 "rw_ios_per_sec": 0, 00:17:51.565 "rw_mbytes_per_sec": 0, 00:17:51.565 "r_mbytes_per_sec": 0, 00:17:51.565 "w_mbytes_per_sec": 0 00:17:51.565 }, 00:17:51.565 "claimed": false, 00:17:51.565 "zoned": false, 00:17:51.565 "supported_io_types": { 00:17:51.565 "read": true, 00:17:51.565 "write": true, 00:17:51.565 "unmap": true, 00:17:51.565 "flush": true, 00:17:51.565 "reset": true, 00:17:51.565 "nvme_admin": false, 00:17:51.565 "nvme_io": false, 00:17:51.565 "nvme_io_md": false, 00:17:51.565 "write_zeroes": true, 00:17:51.565 "zcopy": true, 00:17:51.565 "get_zone_info": false, 00:17:51.565 "zone_management": false, 00:17:51.565 "zone_append": false, 00:17:51.565 "compare": false, 00:17:51.565 "compare_and_write": false, 00:17:51.565 "abort": true, 00:17:51.566 "seek_hole": false, 00:17:51.566 "seek_data": false, 00:17:51.566 "copy": true, 00:17:51.566 "nvme_iov_md": false 00:17:51.566 }, 00:17:51.566 "memory_domains": [ 00:17:51.566 { 00:17:51.566 "dma_device_id": "system", 00:17:51.566 "dma_device_type": 1 00:17:51.566 }, 00:17:51.566 { 00:17:51.566 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.566 "dma_device_type": 2 00:17:51.566 } 00:17:51.566 ], 00:17:51.566 "driver_specific": {} 00:17:51.566 } 00:17:51.566 ] 00:17:51.566 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:51.566 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:51.566 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:51.566 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:51.822 BaseBdev3 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:51.822 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.080 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:52.337 [ 00:17:52.337 { 00:17:52.337 "name": "BaseBdev3", 00:17:52.337 "aliases": [ 00:17:52.337 "bd08601d-4f08-4bc2-bf71-5be7222ae9a9" 
00:17:52.337 ], 00:17:52.337 "product_name": "Malloc disk", 00:17:52.337 "block_size": 512, 00:17:52.337 "num_blocks": 65536, 00:17:52.337 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:52.337 "assigned_rate_limits": { 00:17:52.337 "rw_ios_per_sec": 0, 00:17:52.337 "rw_mbytes_per_sec": 0, 00:17:52.337 "r_mbytes_per_sec": 0, 00:17:52.337 "w_mbytes_per_sec": 0 00:17:52.337 }, 00:17:52.338 "claimed": false, 00:17:52.338 "zoned": false, 00:17:52.338 "supported_io_types": { 00:17:52.338 "read": true, 00:17:52.338 "write": true, 00:17:52.338 "unmap": true, 00:17:52.338 "flush": true, 00:17:52.338 "reset": true, 00:17:52.338 "nvme_admin": false, 00:17:52.338 "nvme_io": false, 00:17:52.338 "nvme_io_md": false, 00:17:52.338 "write_zeroes": true, 00:17:52.338 "zcopy": true, 00:17:52.338 "get_zone_info": false, 00:17:52.338 "zone_management": false, 00:17:52.338 "zone_append": false, 00:17:52.338 "compare": false, 00:17:52.338 "compare_and_write": false, 00:17:52.338 "abort": true, 00:17:52.338 "seek_hole": false, 00:17:52.338 "seek_data": false, 00:17:52.338 "copy": true, 00:17:52.338 "nvme_iov_md": false 00:17:52.338 }, 00:17:52.338 "memory_domains": [ 00:17:52.338 { 00:17:52.338 "dma_device_id": "system", 00:17:52.338 "dma_device_type": 1 00:17:52.338 }, 00:17:52.338 { 00:17:52.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.338 "dma_device_type": 2 00:17:52.338 } 00:17:52.338 ], 00:17:52.338 "driver_specific": {} 00:17:52.338 } 00:17:52.338 ] 00:17:52.338 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:52.338 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:52.338 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:52.338 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:17:52.595 BaseBdev4 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:52.595 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.853 13:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:52.854 [ 00:17:52.854 { 00:17:52.854 "name": "BaseBdev4", 00:17:52.854 "aliases": [ 00:17:52.854 "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19" 00:17:52.854 ], 00:17:52.854 "product_name": "Malloc disk", 00:17:52.854 "block_size": 512, 00:17:52.854 "num_blocks": 65536, 00:17:52.854 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:52.854 "assigned_rate_limits": { 00:17:52.854 "rw_ios_per_sec": 0, 00:17:52.854 "rw_mbytes_per_sec": 0, 00:17:52.854 "r_mbytes_per_sec": 0, 00:17:52.854 "w_mbytes_per_sec": 0 00:17:52.854 }, 00:17:52.854 "claimed": false, 00:17:52.854 "zoned": false, 00:17:52.854 "supported_io_types": { 00:17:52.854 "read": true, 00:17:52.854 "write": true, 00:17:52.854 "unmap": true, 00:17:52.854 "flush": true, 00:17:52.854 "reset": true, 00:17:52.854 "nvme_admin": false, 00:17:52.854 "nvme_io": false, 00:17:52.854 
"nvme_io_md": false, 00:17:52.854 "write_zeroes": true, 00:17:52.854 "zcopy": true, 00:17:52.854 "get_zone_info": false, 00:17:52.854 "zone_management": false, 00:17:52.854 "zone_append": false, 00:17:52.854 "compare": false, 00:17:52.854 "compare_and_write": false, 00:17:52.854 "abort": true, 00:17:52.854 "seek_hole": false, 00:17:52.854 "seek_data": false, 00:17:52.854 "copy": true, 00:17:52.854 "nvme_iov_md": false 00:17:52.854 }, 00:17:52.854 "memory_domains": [ 00:17:52.854 { 00:17:52.854 "dma_device_id": "system", 00:17:52.854 "dma_device_type": 1 00:17:52.854 }, 00:17:52.854 { 00:17:52.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.854 "dma_device_type": 2 00:17:52.854 } 00:17:52.854 ], 00:17:52.854 "driver_specific": {} 00:17:52.854 } 00:17:52.854 ] 00:17:52.854 13:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:52.854 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:52.854 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:52.854 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:53.112 [2024-07-15 13:36:32.503569] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:53.112 [2024-07-15 13:36:32.503610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:53.112 [2024-07-15 13:36:32.503636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:53.112 [2024-07-15 13:36:32.504987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:53.112 [2024-07-15 13:36:32.505031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.112 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.369 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.369 "name": "Existed_Raid", 00:17:53.369 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:17:53.369 "strip_size_kb": 64, 00:17:53.369 "state": "configuring", 00:17:53.369 "raid_level": "raid0", 00:17:53.369 "superblock": true, 00:17:53.369 "num_base_bdevs": 4, 00:17:53.369 "num_base_bdevs_discovered": 3, 00:17:53.369 
"num_base_bdevs_operational": 4, 00:17:53.369 "base_bdevs_list": [ 00:17:53.369 { 00:17:53.369 "name": "BaseBdev1", 00:17:53.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.369 "is_configured": false, 00:17:53.369 "data_offset": 0, 00:17:53.369 "data_size": 0 00:17:53.369 }, 00:17:53.369 { 00:17:53.369 "name": "BaseBdev2", 00:17:53.369 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:53.369 "is_configured": true, 00:17:53.369 "data_offset": 2048, 00:17:53.369 "data_size": 63488 00:17:53.369 }, 00:17:53.369 { 00:17:53.369 "name": "BaseBdev3", 00:17:53.369 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:53.369 "is_configured": true, 00:17:53.369 "data_offset": 2048, 00:17:53.369 "data_size": 63488 00:17:53.369 }, 00:17:53.369 { 00:17:53.369 "name": "BaseBdev4", 00:17:53.369 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:53.369 "is_configured": true, 00:17:53.369 "data_offset": 2048, 00:17:53.369 "data_size": 63488 00:17:53.369 } 00:17:53.369 ] 00:17:53.369 }' 00:17:53.369 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.369 13:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.934 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:54.191 [2024-07-15 13:36:33.506198] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.191 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.448 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.448 "name": "Existed_Raid", 00:17:54.448 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:17:54.448 "strip_size_kb": 64, 00:17:54.448 "state": "configuring", 00:17:54.448 "raid_level": "raid0", 00:17:54.448 "superblock": true, 00:17:54.448 "num_base_bdevs": 4, 00:17:54.448 "num_base_bdevs_discovered": 2, 00:17:54.448 "num_base_bdevs_operational": 4, 00:17:54.448 "base_bdevs_list": [ 00:17:54.448 { 00:17:54.448 "name": "BaseBdev1", 00:17:54.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.448 "is_configured": false, 00:17:54.448 "data_offset": 0, 00:17:54.448 "data_size": 0 00:17:54.448 }, 00:17:54.448 { 00:17:54.448 "name": null, 00:17:54.448 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:54.448 "is_configured": false, 00:17:54.448 "data_offset": 2048, 00:17:54.448 "data_size": 
63488 00:17:54.448 }, 00:17:54.448 { 00:17:54.448 "name": "BaseBdev3", 00:17:54.448 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:54.448 "is_configured": true, 00:17:54.448 "data_offset": 2048, 00:17:54.448 "data_size": 63488 00:17:54.448 }, 00:17:54.448 { 00:17:54.448 "name": "BaseBdev4", 00:17:54.448 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:54.448 "is_configured": true, 00:17:54.448 "data_offset": 2048, 00:17:54.448 "data_size": 63488 00:17:54.448 } 00:17:54.448 ] 00:17:54.448 }' 00:17:54.448 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.448 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:55.011 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.011 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:55.267 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:55.267 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:55.534 [2024-07-15 13:36:34.869139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:55.534 BaseBdev1 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:55.534 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.793 13:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:56.050 [ 00:17:56.050 { 00:17:56.050 "name": "BaseBdev1", 00:17:56.050 "aliases": [ 00:17:56.050 "f58eedf7-2194-44f3-964a-a30ca63d39f5" 00:17:56.050 ], 00:17:56.050 "product_name": "Malloc disk", 00:17:56.050 "block_size": 512, 00:17:56.050 "num_blocks": 65536, 00:17:56.050 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:17:56.050 "assigned_rate_limits": { 00:17:56.050 "rw_ios_per_sec": 0, 00:17:56.050 "rw_mbytes_per_sec": 0, 00:17:56.050 "r_mbytes_per_sec": 0, 00:17:56.050 "w_mbytes_per_sec": 0 00:17:56.050 }, 00:17:56.050 "claimed": true, 00:17:56.050 "claim_type": "exclusive_write", 00:17:56.050 "zoned": false, 00:17:56.050 "supported_io_types": { 00:17:56.050 "read": true, 00:17:56.050 "write": true, 00:17:56.050 "unmap": true, 00:17:56.050 "flush": true, 00:17:56.050 "reset": true, 00:17:56.050 "nvme_admin": false, 00:17:56.050 "nvme_io": false, 00:17:56.050 "nvme_io_md": false, 00:17:56.050 "write_zeroes": true, 00:17:56.050 "zcopy": true, 00:17:56.050 "get_zone_info": false, 00:17:56.050 "zone_management": false, 00:17:56.050 "zone_append": false, 00:17:56.050 "compare": false, 00:17:56.050 "compare_and_write": false, 00:17:56.050 "abort": true, 00:17:56.050 "seek_hole": false, 00:17:56.050 "seek_data": false, 00:17:56.050 "copy": true, 00:17:56.050 "nvme_iov_md": false 00:17:56.050 }, 00:17:56.050 
"memory_domains": [ 00:17:56.050 { 00:17:56.050 "dma_device_id": "system", 00:17:56.050 "dma_device_type": 1 00:17:56.050 }, 00:17:56.050 { 00:17:56.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.050 "dma_device_type": 2 00:17:56.050 } 00:17:56.050 ], 00:17:56.050 "driver_specific": {} 00:17:56.050 } 00:17:56.050 ] 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.050 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.308 13:36:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.308 "name": "Existed_Raid", 00:17:56.308 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:17:56.308 "strip_size_kb": 64, 00:17:56.308 "state": "configuring", 00:17:56.308 "raid_level": "raid0", 00:17:56.308 "superblock": true, 00:17:56.308 "num_base_bdevs": 4, 00:17:56.308 "num_base_bdevs_discovered": 3, 00:17:56.308 "num_base_bdevs_operational": 4, 00:17:56.308 "base_bdevs_list": [ 00:17:56.308 { 00:17:56.308 "name": "BaseBdev1", 00:17:56.308 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:17:56.308 "is_configured": true, 00:17:56.308 "data_offset": 2048, 00:17:56.308 "data_size": 63488 00:17:56.308 }, 00:17:56.308 { 00:17:56.308 "name": null, 00:17:56.308 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:56.308 "is_configured": false, 00:17:56.308 "data_offset": 2048, 00:17:56.308 "data_size": 63488 00:17:56.308 }, 00:17:56.308 { 00:17:56.308 "name": "BaseBdev3", 00:17:56.308 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:56.308 "is_configured": true, 00:17:56.308 "data_offset": 2048, 00:17:56.308 "data_size": 63488 00:17:56.308 }, 00:17:56.308 { 00:17:56.308 "name": "BaseBdev4", 00:17:56.308 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:56.308 "is_configured": true, 00:17:56.308 "data_offset": 2048, 00:17:56.308 "data_size": 63488 00:17:56.308 } 00:17:56.308 ] 00:17:56.308 }' 00:17:56.308 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.308 13:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.872 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.872 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:57.128 13:36:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:57.128 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:57.385 [2024-07-15 13:36:36.686010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.385 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.386 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.386 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.643 13:36:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.643 "name": "Existed_Raid", 00:17:57.643 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:17:57.643 "strip_size_kb": 64, 00:17:57.643 "state": "configuring", 00:17:57.643 "raid_level": "raid0", 00:17:57.643 "superblock": true, 00:17:57.643 "num_base_bdevs": 4, 00:17:57.643 "num_base_bdevs_discovered": 2, 00:17:57.643 "num_base_bdevs_operational": 4, 00:17:57.643 "base_bdevs_list": [ 00:17:57.643 { 00:17:57.643 "name": "BaseBdev1", 00:17:57.643 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:17:57.643 "is_configured": true, 00:17:57.643 "data_offset": 2048, 00:17:57.643 "data_size": 63488 00:17:57.643 }, 00:17:57.643 { 00:17:57.643 "name": null, 00:17:57.643 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:57.643 "is_configured": false, 00:17:57.643 "data_offset": 2048, 00:17:57.643 "data_size": 63488 00:17:57.643 }, 00:17:57.643 { 00:17:57.643 "name": null, 00:17:57.643 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:57.643 "is_configured": false, 00:17:57.643 "data_offset": 2048, 00:17:57.643 "data_size": 63488 00:17:57.643 }, 00:17:57.643 { 00:17:57.643 "name": "BaseBdev4", 00:17:57.643 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:57.643 "is_configured": true, 00:17:57.643 "data_offset": 2048, 00:17:57.643 "data_size": 63488 00:17:57.643 } 00:17:57.643 ] 00:17:57.643 }' 00:17:57.643 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.643 13:36:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:58.208 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.208 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:58.467 13:36:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:58.467 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:58.745 [2024-07-15 13:36:38.017549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.745 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.017 13:36:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.017 "name": "Existed_Raid", 00:17:59.017 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:17:59.017 "strip_size_kb": 64, 00:17:59.017 "state": "configuring", 00:17:59.017 "raid_level": "raid0", 00:17:59.017 "superblock": true, 00:17:59.017 "num_base_bdevs": 4, 00:17:59.017 "num_base_bdevs_discovered": 3, 00:17:59.017 "num_base_bdevs_operational": 4, 00:17:59.017 "base_bdevs_list": [ 00:17:59.017 { 00:17:59.017 "name": "BaseBdev1", 00:17:59.017 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:17:59.017 "is_configured": true, 00:17:59.017 "data_offset": 2048, 00:17:59.017 "data_size": 63488 00:17:59.017 }, 00:17:59.017 { 00:17:59.017 "name": null, 00:17:59.017 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:17:59.017 "is_configured": false, 00:17:59.017 "data_offset": 2048, 00:17:59.017 "data_size": 63488 00:17:59.017 }, 00:17:59.017 { 00:17:59.017 "name": "BaseBdev3", 00:17:59.017 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:17:59.017 "is_configured": true, 00:17:59.017 "data_offset": 2048, 00:17:59.017 "data_size": 63488 00:17:59.017 }, 00:17:59.017 { 00:17:59.017 "name": "BaseBdev4", 00:17:59.017 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:17:59.017 "is_configured": true, 00:17:59.017 "data_offset": 2048, 00:17:59.017 "data_size": 63488 00:17:59.017 } 00:17:59.017 ] 00:17:59.017 }' 00:17:59.017 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.017 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:59.583 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:59.583 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.841 13:36:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:59.841 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:00.099 [2024-07-15 13:36:39.353102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.099 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.357 13:36:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.357 "name": "Existed_Raid", 00:18:00.357 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:18:00.357 "strip_size_kb": 64, 00:18:00.357 "state": "configuring", 00:18:00.357 "raid_level": "raid0", 00:18:00.357 "superblock": true, 00:18:00.357 "num_base_bdevs": 4, 00:18:00.357 "num_base_bdevs_discovered": 2, 00:18:00.357 "num_base_bdevs_operational": 4, 00:18:00.357 "base_bdevs_list": [ 00:18:00.357 { 00:18:00.357 "name": null, 00:18:00.357 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:00.357 "is_configured": false, 00:18:00.357 "data_offset": 2048, 00:18:00.357 "data_size": 63488 00:18:00.357 }, 00:18:00.357 { 00:18:00.357 "name": null, 00:18:00.357 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:18:00.357 "is_configured": false, 00:18:00.357 "data_offset": 2048, 00:18:00.357 "data_size": 63488 00:18:00.357 }, 00:18:00.357 { 00:18:00.357 "name": "BaseBdev3", 00:18:00.357 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:18:00.357 "is_configured": true, 00:18:00.357 "data_offset": 2048, 00:18:00.357 "data_size": 63488 00:18:00.357 }, 00:18:00.357 { 00:18:00.357 "name": "BaseBdev4", 00:18:00.357 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:18:00.357 "is_configured": true, 00:18:00.357 "data_offset": 2048, 00:18:00.357 "data_size": 63488 00:18:00.357 } 00:18:00.357 ] 00:18:00.357 }' 00:18:00.357 13:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.357 13:36:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:00.931 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.931 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:01.193 13:36:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:01.193 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:01.450 [2024-07-15 13:36:40.697143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.451 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.730 13:36:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.730 "name": "Existed_Raid", 00:18:01.730 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:18:01.730 "strip_size_kb": 64, 00:18:01.730 "state": "configuring", 00:18:01.730 "raid_level": "raid0", 00:18:01.730 "superblock": true, 00:18:01.730 "num_base_bdevs": 4, 00:18:01.730 "num_base_bdevs_discovered": 3, 00:18:01.730 "num_base_bdevs_operational": 4, 00:18:01.730 "base_bdevs_list": [ 00:18:01.730 { 00:18:01.730 "name": null, 00:18:01.730 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:01.730 "is_configured": false, 00:18:01.730 "data_offset": 2048, 00:18:01.730 "data_size": 63488 00:18:01.730 }, 00:18:01.730 { 00:18:01.730 "name": "BaseBdev2", 00:18:01.730 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:18:01.730 "is_configured": true, 00:18:01.730 "data_offset": 2048, 00:18:01.730 "data_size": 63488 00:18:01.730 }, 00:18:01.730 { 00:18:01.730 "name": "BaseBdev3", 00:18:01.730 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:18:01.730 "is_configured": true, 00:18:01.730 "data_offset": 2048, 00:18:01.730 "data_size": 63488 00:18:01.730 }, 00:18:01.730 { 00:18:01.730 "name": "BaseBdev4", 00:18:01.730 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:18:01.730 "is_configured": true, 00:18:01.730 "data_offset": 2048, 00:18:01.730 "data_size": 63488 00:18:01.730 } 00:18:01.730 ] 00:18:01.730 }' 00:18:01.730 13:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.730 13:36:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:02.296 13:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.296 13:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:02.555 13:36:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:02.555 13:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.555 13:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:02.814 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f58eedf7-2194-44f3-964a-a30ca63d39f5 00:18:03.074 [2024-07-15 13:36:42.284871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:03.074 [2024-07-15 13:36:42.285035] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9d4470 00:18:03.074 [2024-07-15 13:36:42.285050] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:03.074 [2024-07-15 13:36:42.285224] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c4c40 00:18:03.074 [2024-07-15 13:36:42.285341] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9d4470 00:18:03.074 [2024-07-15 13:36:42.285350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9d4470 00:18:03.074 [2024-07-15 13:36:42.285440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:03.074 NewBaseBdev 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 
-- # local i 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:03.074 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:03.333 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:03.592 [ 00:18:03.592 { 00:18:03.592 "name": "NewBaseBdev", 00:18:03.592 "aliases": [ 00:18:03.592 "f58eedf7-2194-44f3-964a-a30ca63d39f5" 00:18:03.592 ], 00:18:03.592 "product_name": "Malloc disk", 00:18:03.592 "block_size": 512, 00:18:03.592 "num_blocks": 65536, 00:18:03.592 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:03.592 "assigned_rate_limits": { 00:18:03.592 "rw_ios_per_sec": 0, 00:18:03.592 "rw_mbytes_per_sec": 0, 00:18:03.592 "r_mbytes_per_sec": 0, 00:18:03.592 "w_mbytes_per_sec": 0 00:18:03.592 }, 00:18:03.592 "claimed": true, 00:18:03.592 "claim_type": "exclusive_write", 00:18:03.592 "zoned": false, 00:18:03.592 "supported_io_types": { 00:18:03.592 "read": true, 00:18:03.592 "write": true, 00:18:03.592 "unmap": true, 00:18:03.592 "flush": true, 00:18:03.592 "reset": true, 00:18:03.592 "nvme_admin": false, 00:18:03.592 "nvme_io": false, 00:18:03.592 "nvme_io_md": false, 00:18:03.592 "write_zeroes": true, 00:18:03.592 "zcopy": true, 00:18:03.592 "get_zone_info": false, 00:18:03.592 "zone_management": false, 00:18:03.592 "zone_append": false, 00:18:03.592 "compare": false, 00:18:03.592 "compare_and_write": false, 00:18:03.592 "abort": true, 00:18:03.592 "seek_hole": false, 00:18:03.592 "seek_data": false, 00:18:03.592 "copy": true, 00:18:03.592 "nvme_iov_md": false 00:18:03.592 }, 
00:18:03.592 "memory_domains": [ 00:18:03.592 { 00:18:03.592 "dma_device_id": "system", 00:18:03.592 "dma_device_type": 1 00:18:03.592 }, 00:18:03.592 { 00:18:03.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.592 "dma_device_type": 2 00:18:03.592 } 00:18:03.592 ], 00:18:03.592 "driver_specific": {} 00:18:03.592 } 00:18:03.592 ] 00:18:03.592 13:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:03.592 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:03.592 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.592 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.592 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.593 13:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.852 13:36:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.852 "name": "Existed_Raid", 00:18:03.852 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:18:03.852 "strip_size_kb": 64, 00:18:03.852 "state": "online", 00:18:03.852 "raid_level": "raid0", 00:18:03.852 "superblock": true, 00:18:03.852 "num_base_bdevs": 4, 00:18:03.852 "num_base_bdevs_discovered": 4, 00:18:03.852 "num_base_bdevs_operational": 4, 00:18:03.852 "base_bdevs_list": [ 00:18:03.852 { 00:18:03.852 "name": "NewBaseBdev", 00:18:03.852 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:03.852 "is_configured": true, 00:18:03.852 "data_offset": 2048, 00:18:03.852 "data_size": 63488 00:18:03.852 }, 00:18:03.852 { 00:18:03.852 "name": "BaseBdev2", 00:18:03.852 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:18:03.852 "is_configured": true, 00:18:03.852 "data_offset": 2048, 00:18:03.852 "data_size": 63488 00:18:03.852 }, 00:18:03.852 { 00:18:03.852 "name": "BaseBdev3", 00:18:03.852 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:18:03.852 "is_configured": true, 00:18:03.852 "data_offset": 2048, 00:18:03.852 "data_size": 63488 00:18:03.852 }, 00:18:03.852 { 00:18:03.852 "name": "BaseBdev4", 00:18:03.852 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:18:03.852 "is_configured": true, 00:18:03.852 "data_offset": 2048, 00:18:03.852 "data_size": 63488 00:18:03.852 } 00:18:03.852 ] 00:18:03.852 }' 00:18:03.852 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.852 13:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:04.418 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:04.678 [2024-07-15 13:36:43.853387] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:04.678 "name": "Existed_Raid", 00:18:04.678 "aliases": [ 00:18:04.678 "e6923525-dee9-4423-9e10-0c3cfded334b" 00:18:04.678 ], 00:18:04.678 "product_name": "Raid Volume", 00:18:04.678 "block_size": 512, 00:18:04.678 "num_blocks": 253952, 00:18:04.678 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:18:04.678 "assigned_rate_limits": { 00:18:04.678 "rw_ios_per_sec": 0, 00:18:04.678 "rw_mbytes_per_sec": 0, 00:18:04.678 "r_mbytes_per_sec": 0, 00:18:04.678 "w_mbytes_per_sec": 0 00:18:04.678 }, 00:18:04.678 "claimed": false, 00:18:04.678 "zoned": false, 00:18:04.678 "supported_io_types": { 00:18:04.678 "read": true, 00:18:04.678 "write": true, 00:18:04.678 "unmap": true, 00:18:04.678 "flush": true, 00:18:04.678 "reset": true, 00:18:04.678 "nvme_admin": false, 00:18:04.678 "nvme_io": false, 00:18:04.678 "nvme_io_md": false, 00:18:04.678 "write_zeroes": true, 00:18:04.678 "zcopy": false, 00:18:04.678 "get_zone_info": false, 00:18:04.678 "zone_management": false, 00:18:04.678 "zone_append": false, 00:18:04.678 "compare": false, 00:18:04.678 "compare_and_write": false, 00:18:04.678 "abort": false, 00:18:04.678 "seek_hole": 
false, 00:18:04.678 "seek_data": false, 00:18:04.678 "copy": false, 00:18:04.678 "nvme_iov_md": false 00:18:04.678 }, 00:18:04.678 "memory_domains": [ 00:18:04.678 { 00:18:04.678 "dma_device_id": "system", 00:18:04.678 "dma_device_type": 1 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.678 "dma_device_type": 2 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "system", 00:18:04.678 "dma_device_type": 1 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.678 "dma_device_type": 2 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "system", 00:18:04.678 "dma_device_type": 1 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.678 "dma_device_type": 2 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "system", 00:18:04.678 "dma_device_type": 1 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.678 "dma_device_type": 2 00:18:04.678 } 00:18:04.678 ], 00:18:04.678 "driver_specific": { 00:18:04.678 "raid": { 00:18:04.678 "uuid": "e6923525-dee9-4423-9e10-0c3cfded334b", 00:18:04.678 "strip_size_kb": 64, 00:18:04.678 "state": "online", 00:18:04.678 "raid_level": "raid0", 00:18:04.678 "superblock": true, 00:18:04.678 "num_base_bdevs": 4, 00:18:04.678 "num_base_bdevs_discovered": 4, 00:18:04.678 "num_base_bdevs_operational": 4, 00:18:04.678 "base_bdevs_list": [ 00:18:04.678 { 00:18:04.678 "name": "NewBaseBdev", 00:18:04.678 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:04.678 "is_configured": true, 00:18:04.678 "data_offset": 2048, 00:18:04.678 "data_size": 63488 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "name": "BaseBdev2", 00:18:04.678 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:18:04.678 "is_configured": true, 00:18:04.678 "data_offset": 2048, 00:18:04.678 "data_size": 63488 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "name": "BaseBdev3", 00:18:04.678 
"uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:18:04.678 "is_configured": true, 00:18:04.678 "data_offset": 2048, 00:18:04.678 "data_size": 63488 00:18:04.678 }, 00:18:04.678 { 00:18:04.678 "name": "BaseBdev4", 00:18:04.678 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:18:04.678 "is_configured": true, 00:18:04.678 "data_offset": 2048, 00:18:04.678 "data_size": 63488 00:18:04.678 } 00:18:04.678 ] 00:18:04.678 } 00:18:04.678 } 00:18:04.678 }' 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:04.678 BaseBdev2 00:18:04.678 BaseBdev3 00:18:04.678 BaseBdev4' 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:04.678 13:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:04.938 "name": "NewBaseBdev", 00:18:04.938 "aliases": [ 00:18:04.938 "f58eedf7-2194-44f3-964a-a30ca63d39f5" 00:18:04.938 ], 00:18:04.938 "product_name": "Malloc disk", 00:18:04.938 "block_size": 512, 00:18:04.938 "num_blocks": 65536, 00:18:04.938 "uuid": "f58eedf7-2194-44f3-964a-a30ca63d39f5", 00:18:04.938 "assigned_rate_limits": { 00:18:04.938 "rw_ios_per_sec": 0, 00:18:04.938 "rw_mbytes_per_sec": 0, 00:18:04.938 "r_mbytes_per_sec": 0, 00:18:04.938 "w_mbytes_per_sec": 0 00:18:04.938 }, 00:18:04.938 "claimed": true, 00:18:04.938 "claim_type": "exclusive_write", 00:18:04.938 "zoned": false, 00:18:04.938 "supported_io_types": { 
00:18:04.938 "read": true, 00:18:04.938 "write": true, 00:18:04.938 "unmap": true, 00:18:04.938 "flush": true, 00:18:04.938 "reset": true, 00:18:04.938 "nvme_admin": false, 00:18:04.938 "nvme_io": false, 00:18:04.938 "nvme_io_md": false, 00:18:04.938 "write_zeroes": true, 00:18:04.938 "zcopy": true, 00:18:04.938 "get_zone_info": false, 00:18:04.938 "zone_management": false, 00:18:04.938 "zone_append": false, 00:18:04.938 "compare": false, 00:18:04.938 "compare_and_write": false, 00:18:04.938 "abort": true, 00:18:04.938 "seek_hole": false, 00:18:04.938 "seek_data": false, 00:18:04.938 "copy": true, 00:18:04.938 "nvme_iov_md": false 00:18:04.938 }, 00:18:04.938 "memory_domains": [ 00:18:04.938 { 00:18:04.938 "dma_device_id": "system", 00:18:04.938 "dma_device_type": 1 00:18:04.938 }, 00:18:04.938 { 00:18:04.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.938 "dma_device_type": 2 00:18:04.938 } 00:18:04.938 ], 00:18:04.938 "driver_specific": {} 00:18:04.938 }' 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:04.938 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:05.197 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.456 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.456 "name": "BaseBdev2", 00:18:05.456 "aliases": [ 00:18:05.456 "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c" 00:18:05.456 ], 00:18:05.456 "product_name": "Malloc disk", 00:18:05.456 "block_size": 512, 00:18:05.456 "num_blocks": 65536, 00:18:05.456 "uuid": "9a9c5f3d-2241-44c4-9e9f-3919c3098e2c", 00:18:05.456 "assigned_rate_limits": { 00:18:05.456 "rw_ios_per_sec": 0, 00:18:05.456 "rw_mbytes_per_sec": 0, 00:18:05.456 "r_mbytes_per_sec": 0, 00:18:05.456 "w_mbytes_per_sec": 0 00:18:05.456 }, 00:18:05.456 "claimed": true, 00:18:05.456 "claim_type": "exclusive_write", 00:18:05.456 "zoned": false, 00:18:05.456 "supported_io_types": { 00:18:05.456 "read": true, 00:18:05.456 "write": true, 00:18:05.456 "unmap": true, 00:18:05.456 "flush": true, 00:18:05.456 "reset": true, 00:18:05.456 "nvme_admin": false, 00:18:05.456 "nvme_io": false, 00:18:05.456 "nvme_io_md": false, 00:18:05.456 "write_zeroes": true, 00:18:05.456 "zcopy": true, 00:18:05.456 "get_zone_info": false, 00:18:05.456 "zone_management": false, 00:18:05.456 "zone_append": false, 00:18:05.456 "compare": false, 00:18:05.456 "compare_and_write": false, 00:18:05.456 "abort": true, 00:18:05.456 "seek_hole": false, 00:18:05.456 "seek_data": 
false, 00:18:05.456 "copy": true, 00:18:05.456 "nvme_iov_md": false 00:18:05.456 }, 00:18:05.456 "memory_domains": [ 00:18:05.456 { 00:18:05.456 "dma_device_id": "system", 00:18:05.456 "dma_device_type": 1 00:18:05.456 }, 00:18:05.456 { 00:18:05.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.456 "dma_device_type": 2 00:18:05.456 } 00:18:05.456 ], 00:18:05.456 "driver_specific": {} 00:18:05.456 }' 00:18:05.456 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.456 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.457 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.457 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.715 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.715 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.715 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.715 13:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:18:05.715 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.974 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.974 "name": "BaseBdev3", 00:18:05.974 "aliases": [ 00:18:05.974 "bd08601d-4f08-4bc2-bf71-5be7222ae9a9" 00:18:05.974 ], 00:18:05.974 "product_name": "Malloc disk", 00:18:05.974 "block_size": 512, 00:18:05.974 "num_blocks": 65536, 00:18:05.974 "uuid": "bd08601d-4f08-4bc2-bf71-5be7222ae9a9", 00:18:05.974 "assigned_rate_limits": { 00:18:05.974 "rw_ios_per_sec": 0, 00:18:05.974 "rw_mbytes_per_sec": 0, 00:18:05.974 "r_mbytes_per_sec": 0, 00:18:05.974 "w_mbytes_per_sec": 0 00:18:05.974 }, 00:18:05.974 "claimed": true, 00:18:05.974 "claim_type": "exclusive_write", 00:18:05.974 "zoned": false, 00:18:05.974 "supported_io_types": { 00:18:05.975 "read": true, 00:18:05.975 "write": true, 00:18:05.975 "unmap": true, 00:18:05.975 "flush": true, 00:18:05.975 "reset": true, 00:18:05.975 "nvme_admin": false, 00:18:05.975 "nvme_io": false, 00:18:05.975 "nvme_io_md": false, 00:18:05.975 "write_zeroes": true, 00:18:05.975 "zcopy": true, 00:18:05.975 "get_zone_info": false, 00:18:05.975 "zone_management": false, 00:18:05.975 "zone_append": false, 00:18:05.975 "compare": false, 00:18:05.975 "compare_and_write": false, 00:18:05.975 "abort": true, 00:18:05.975 "seek_hole": false, 00:18:05.975 "seek_data": false, 00:18:05.975 "copy": true, 00:18:05.975 "nvme_iov_md": false 00:18:05.975 }, 00:18:05.975 "memory_domains": [ 00:18:05.975 { 00:18:05.975 "dma_device_id": "system", 00:18:05.975 "dma_device_type": 1 00:18:05.975 }, 00:18:05.975 { 00:18:05.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.975 "dma_device_type": 2 00:18:05.975 } 00:18:05.975 ], 00:18:05.975 "driver_specific": {} 00:18:05.975 }' 00:18:05.975 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:06.233 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.492 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.492 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.492 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:06.492 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:06.492 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:06.750 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:06.750 "name": "BaseBdev4", 00:18:06.750 "aliases": [ 00:18:06.750 "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19" 00:18:06.750 ], 00:18:06.750 "product_name": "Malloc disk", 00:18:06.750 "block_size": 512, 00:18:06.750 "num_blocks": 65536, 00:18:06.750 "uuid": "b7d19e6d-95ed-42d9-9aeb-8ab96d52ef19", 00:18:06.750 "assigned_rate_limits": { 00:18:06.750 
"rw_ios_per_sec": 0, 00:18:06.750 "rw_mbytes_per_sec": 0, 00:18:06.750 "r_mbytes_per_sec": 0, 00:18:06.750 "w_mbytes_per_sec": 0 00:18:06.750 }, 00:18:06.750 "claimed": true, 00:18:06.750 "claim_type": "exclusive_write", 00:18:06.750 "zoned": false, 00:18:06.751 "supported_io_types": { 00:18:06.751 "read": true, 00:18:06.751 "write": true, 00:18:06.751 "unmap": true, 00:18:06.751 "flush": true, 00:18:06.751 "reset": true, 00:18:06.751 "nvme_admin": false, 00:18:06.751 "nvme_io": false, 00:18:06.751 "nvme_io_md": false, 00:18:06.751 "write_zeroes": true, 00:18:06.751 "zcopy": true, 00:18:06.751 "get_zone_info": false, 00:18:06.751 "zone_management": false, 00:18:06.751 "zone_append": false, 00:18:06.751 "compare": false, 00:18:06.751 "compare_and_write": false, 00:18:06.751 "abort": true, 00:18:06.751 "seek_hole": false, 00:18:06.751 "seek_data": false, 00:18:06.751 "copy": true, 00:18:06.751 "nvme_iov_md": false 00:18:06.751 }, 00:18:06.751 "memory_domains": [ 00:18:06.751 { 00:18:06.751 "dma_device_id": "system", 00:18:06.751 "dma_device_type": 1 00:18:06.751 }, 00:18:06.751 { 00:18:06.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.751 "dma_device_type": 2 00:18:06.751 } 00:18:06.751 ], 00:18:06.751 "driver_specific": {} 00:18:06.751 }' 00:18:06.751 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.751 13:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.751 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.751 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.751 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.751 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.751 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:07.009 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:07.268 [2024-07-15 13:36:46.528154] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:07.268 [2024-07-15 13:36:46.528180] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:07.268 [2024-07-15 13:36:46.528232] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:07.269 [2024-07-15 13:36:46.528296] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:07.269 [2024-07-15 13:36:46.528309] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9d4470 name Existed_Raid, state offline 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2133212 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2133212 ']' 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2133212 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2133212 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2133212' 00:18:07.269 killing process with pid 2133212 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2133212 00:18:07.269 [2024-07-15 13:36:46.597829] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:07.269 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2133212 00:18:07.269 [2024-07-15 13:36:46.635866] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:07.528 13:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:07.528 00:18:07.528 real 0m32.078s 00:18:07.528 user 0m59.492s 00:18:07.528 sys 0m5.643s 00:18:07.528 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:07.528 13:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.528 ************************************ 00:18:07.528 END TEST raid_state_function_test_sb 00:18:07.528 ************************************ 00:18:07.528 13:36:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:07.528 13:36:46 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:07.528 13:36:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:07.528 13:36:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:07.528 13:36:46 bdev_raid -- common/autotest_common.sh@10 -- # set 
+x 00:18:07.528 ************************************ 00:18:07.528 START TEST raid_superblock_test 00:18:07.528 ************************************ 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:07.528 13:36:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2138107 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2138107 /var/tmp/spdk-raid.sock 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2138107 ']' 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:07.528 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:07.528 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.787 [2024-07-15 13:36:46.987640] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:18:07.787 [2024-07-15 13:36:46.987710] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2138107 ] 00:18:07.787 [2024-07-15 13:36:47.116877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:08.046 [2024-07-15 13:36:47.219169] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:08.046 [2024-07-15 13:36:47.282020] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.046 [2024-07-15 13:36:47.282060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:08.613 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:08.871 malloc1 00:18:08.871 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:09.130 [2024-07-15 13:36:48.411340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:09.130 [2024-07-15 13:36:48.411389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.130 [2024-07-15 13:36:48.411411] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e2570 00:18:09.130 [2024-07-15 13:36:48.411424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.130 [2024-07-15 13:36:48.413169] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.130 [2024-07-15 13:36:48.413197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:09.130 pt1 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:09.130 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:09.130 13:36:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:09.388 malloc2 00:18:09.388 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:09.646 [2024-07-15 13:36:48.906710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:09.646 [2024-07-15 13:36:48.906750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.646 [2024-07-15 13:36:48.906774] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e3970 00:18:09.646 [2024-07-15 13:36:48.906787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.646 [2024-07-15 13:36:48.908403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.646 [2024-07-15 13:36:48.908430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:09.647 pt2 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:09.647 13:36:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:09.647 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:09.905 malloc3 00:18:09.905 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:10.163 [2024-07-15 13:36:49.400634] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:10.163 [2024-07-15 13:36:49.400680] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:10.163 [2024-07-15 13:36:49.400698] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227a340 00:18:10.163 [2024-07-15 13:36:49.400710] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:10.163 [2024-07-15 13:36:49.402256] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:10.163 [2024-07-15 13:36:49.402284] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:10.163 pt3 00:18:10.163 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:10.164 
13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:10.164 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:10.422 malloc4 00:18:10.422 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:10.680 [2024-07-15 13:36:49.887716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:10.680 [2024-07-15 13:36:49.887761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:10.680 [2024-07-15 13:36:49.887783] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227cc60 00:18:10.680 [2024-07-15 13:36:49.887795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:10.680 [2024-07-15 13:36:49.889367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:10.680 [2024-07-15 13:36:49.889400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:10.680 pt4 00:18:10.680 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:10.680 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:10.680 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:10.937 [2024-07-15 13:36:50.136410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:18:10.937 [2024-07-15 13:36:50.137810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:10.937 [2024-07-15 13:36:50.137868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:10.937 [2024-07-15 13:36:50.137912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:10.937 [2024-07-15 13:36:50.138098] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20da530 00:18:10.937 [2024-07-15 13:36:50.138110] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:10.937 [2024-07-15 13:36:50.138317] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d8770 00:18:10.937 [2024-07-15 13:36:50.138468] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20da530 00:18:10.937 [2024-07-15 13:36:50.138478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20da530 00:18:10.937 [2024-07-15 13:36:50.138581] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.937 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:11.194 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.194 "name": "raid_bdev1", 00:18:11.194 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:11.194 "strip_size_kb": 64, 00:18:11.194 "state": "online", 00:18:11.194 "raid_level": "raid0", 00:18:11.194 "superblock": true, 00:18:11.194 "num_base_bdevs": 4, 00:18:11.194 "num_base_bdevs_discovered": 4, 00:18:11.194 "num_base_bdevs_operational": 4, 00:18:11.194 "base_bdevs_list": [ 00:18:11.194 { 00:18:11.194 "name": "pt1", 00:18:11.194 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:11.194 "is_configured": true, 00:18:11.194 "data_offset": 2048, 00:18:11.194 "data_size": 63488 00:18:11.194 }, 00:18:11.194 { 00:18:11.194 "name": "pt2", 00:18:11.194 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:11.194 "is_configured": true, 00:18:11.194 "data_offset": 2048, 00:18:11.194 "data_size": 63488 00:18:11.194 }, 00:18:11.194 { 00:18:11.194 "name": "pt3", 00:18:11.194 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:11.194 "is_configured": true, 00:18:11.194 "data_offset": 2048, 00:18:11.194 "data_size": 63488 00:18:11.194 }, 00:18:11.194 { 00:18:11.194 "name": "pt4", 00:18:11.194 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:11.194 "is_configured": true, 00:18:11.194 "data_offset": 2048, 00:18:11.194 "data_size": 63488 00:18:11.194 } 00:18:11.194 ] 00:18:11.194 }' 00:18:11.194 13:36:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.194 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:11.759 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:11.759 [2024-07-15 13:36:51.175448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:12.017 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:12.017 "name": "raid_bdev1", 00:18:12.017 "aliases": [ 00:18:12.017 "7fc7bdef-e7eb-4756-8e91-744b785e0ec0" 00:18:12.017 ], 00:18:12.017 "product_name": "Raid Volume", 00:18:12.017 "block_size": 512, 00:18:12.017 "num_blocks": 253952, 00:18:12.017 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:12.017 "assigned_rate_limits": { 00:18:12.017 "rw_ios_per_sec": 0, 00:18:12.017 "rw_mbytes_per_sec": 0, 00:18:12.017 "r_mbytes_per_sec": 0, 00:18:12.017 "w_mbytes_per_sec": 0 00:18:12.017 }, 00:18:12.017 "claimed": false, 00:18:12.017 "zoned": false, 00:18:12.017 "supported_io_types": { 00:18:12.017 "read": true, 00:18:12.017 "write": true, 00:18:12.017 
"unmap": true, 00:18:12.017 "flush": true, 00:18:12.017 "reset": true, 00:18:12.017 "nvme_admin": false, 00:18:12.017 "nvme_io": false, 00:18:12.017 "nvme_io_md": false, 00:18:12.017 "write_zeroes": true, 00:18:12.017 "zcopy": false, 00:18:12.017 "get_zone_info": false, 00:18:12.017 "zone_management": false, 00:18:12.017 "zone_append": false, 00:18:12.017 "compare": false, 00:18:12.017 "compare_and_write": false, 00:18:12.017 "abort": false, 00:18:12.017 "seek_hole": false, 00:18:12.017 "seek_data": false, 00:18:12.017 "copy": false, 00:18:12.017 "nvme_iov_md": false 00:18:12.017 }, 00:18:12.017 "memory_domains": [ 00:18:12.017 { 00:18:12.017 "dma_device_id": "system", 00:18:12.017 "dma_device_type": 1 00:18:12.017 }, 00:18:12.017 { 00:18:12.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.017 "dma_device_type": 2 00:18:12.017 }, 00:18:12.017 { 00:18:12.017 "dma_device_id": "system", 00:18:12.017 "dma_device_type": 1 00:18:12.017 }, 00:18:12.017 { 00:18:12.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.017 "dma_device_type": 2 00:18:12.017 }, 00:18:12.017 { 00:18:12.017 "dma_device_id": "system", 00:18:12.017 "dma_device_type": 1 00:18:12.017 }, 00:18:12.017 { 00:18:12.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.018 "dma_device_type": 2 00:18:12.018 }, 00:18:12.018 { 00:18:12.018 "dma_device_id": "system", 00:18:12.018 "dma_device_type": 1 00:18:12.018 }, 00:18:12.018 { 00:18:12.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.018 "dma_device_type": 2 00:18:12.018 } 00:18:12.018 ], 00:18:12.018 "driver_specific": { 00:18:12.018 "raid": { 00:18:12.018 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:12.018 "strip_size_kb": 64, 00:18:12.018 "state": "online", 00:18:12.018 "raid_level": "raid0", 00:18:12.018 "superblock": true, 00:18:12.018 "num_base_bdevs": 4, 00:18:12.018 "num_base_bdevs_discovered": 4, 00:18:12.018 "num_base_bdevs_operational": 4, 00:18:12.018 "base_bdevs_list": [ 00:18:12.018 { 00:18:12.018 "name": "pt1", 
00:18:12.018 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:12.018 "is_configured": true, 00:18:12.018 "data_offset": 2048, 00:18:12.018 "data_size": 63488 00:18:12.018 }, 00:18:12.018 { 00:18:12.018 "name": "pt2", 00:18:12.018 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:12.018 "is_configured": true, 00:18:12.018 "data_offset": 2048, 00:18:12.018 "data_size": 63488 00:18:12.018 }, 00:18:12.018 { 00:18:12.018 "name": "pt3", 00:18:12.018 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:12.018 "is_configured": true, 00:18:12.018 "data_offset": 2048, 00:18:12.018 "data_size": 63488 00:18:12.018 }, 00:18:12.018 { 00:18:12.018 "name": "pt4", 00:18:12.018 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:12.018 "is_configured": true, 00:18:12.018 "data_offset": 2048, 00:18:12.018 "data_size": 63488 00:18:12.018 } 00:18:12.018 ] 00:18:12.018 } 00:18:12.018 } 00:18:12.018 }' 00:18:12.018 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:12.018 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:12.018 pt2 00:18:12.018 pt3 00:18:12.018 pt4' 00:18:12.018 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:12.018 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:12.018 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:12.342 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:12.342 "name": "pt1", 00:18:12.342 "aliases": [ 00:18:12.342 "00000000-0000-0000-0000-000000000001" 00:18:12.342 ], 00:18:12.342 "product_name": "passthru", 00:18:12.342 "block_size": 512, 00:18:12.342 "num_blocks": 65536, 00:18:12.342 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:18:12.342 "assigned_rate_limits": { 00:18:12.342 "rw_ios_per_sec": 0, 00:18:12.342 "rw_mbytes_per_sec": 0, 00:18:12.342 "r_mbytes_per_sec": 0, 00:18:12.342 "w_mbytes_per_sec": 0 00:18:12.342 }, 00:18:12.342 "claimed": true, 00:18:12.342 "claim_type": "exclusive_write", 00:18:12.342 "zoned": false, 00:18:12.342 "supported_io_types": { 00:18:12.342 "read": true, 00:18:12.342 "write": true, 00:18:12.342 "unmap": true, 00:18:12.342 "flush": true, 00:18:12.342 "reset": true, 00:18:12.342 "nvme_admin": false, 00:18:12.342 "nvme_io": false, 00:18:12.342 "nvme_io_md": false, 00:18:12.342 "write_zeroes": true, 00:18:12.342 "zcopy": true, 00:18:12.342 "get_zone_info": false, 00:18:12.342 "zone_management": false, 00:18:12.342 "zone_append": false, 00:18:12.342 "compare": false, 00:18:12.342 "compare_and_write": false, 00:18:12.342 "abort": true, 00:18:12.342 "seek_hole": false, 00:18:12.342 "seek_data": false, 00:18:12.342 "copy": true, 00:18:12.342 "nvme_iov_md": false 00:18:12.342 }, 00:18:12.342 "memory_domains": [ 00:18:12.342 { 00:18:12.343 "dma_device_id": "system", 00:18:12.343 "dma_device_type": 1 00:18:12.343 }, 00:18:12.343 { 00:18:12.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.343 "dma_device_type": 2 00:18:12.343 } 00:18:12.343 ], 00:18:12.343 "driver_specific": { 00:18:12.343 "passthru": { 00:18:12.343 "name": "pt1", 00:18:12.343 "base_bdev_name": "malloc1" 00:18:12.343 } 00:18:12.343 } 00:18:12.343 }' 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.343 13:36:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:12.343 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:12.605 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:12.605 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:12.605 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:12.605 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:12.605 13:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:12.862 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:12.862 "name": "pt2", 00:18:12.862 "aliases": [ 00:18:12.862 "00000000-0000-0000-0000-000000000002" 00:18:12.862 ], 00:18:12.862 "product_name": "passthru", 00:18:12.862 "block_size": 512, 00:18:12.862 "num_blocks": 65536, 00:18:12.862 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:12.862 "assigned_rate_limits": { 00:18:12.862 "rw_ios_per_sec": 0, 00:18:12.862 "rw_mbytes_per_sec": 0, 00:18:12.862 "r_mbytes_per_sec": 0, 00:18:12.862 "w_mbytes_per_sec": 0 00:18:12.862 }, 00:18:12.862 "claimed": true, 00:18:12.862 "claim_type": "exclusive_write", 00:18:12.863 "zoned": false, 00:18:12.863 "supported_io_types": { 00:18:12.863 "read": true, 00:18:12.863 "write": true, 00:18:12.863 "unmap": true, 00:18:12.863 "flush": true, 00:18:12.863 "reset": true, 00:18:12.863 "nvme_admin": false, 00:18:12.863 
"nvme_io": false, 00:18:12.863 "nvme_io_md": false, 00:18:12.863 "write_zeroes": true, 00:18:12.863 "zcopy": true, 00:18:12.863 "get_zone_info": false, 00:18:12.863 "zone_management": false, 00:18:12.863 "zone_append": false, 00:18:12.863 "compare": false, 00:18:12.863 "compare_and_write": false, 00:18:12.863 "abort": true, 00:18:12.863 "seek_hole": false, 00:18:12.863 "seek_data": false, 00:18:12.863 "copy": true, 00:18:12.863 "nvme_iov_md": false 00:18:12.863 }, 00:18:12.863 "memory_domains": [ 00:18:12.863 { 00:18:12.863 "dma_device_id": "system", 00:18:12.863 "dma_device_type": 1 00:18:12.863 }, 00:18:12.863 { 00:18:12.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.863 "dma_device_type": 2 00:18:12.863 } 00:18:12.863 ], 00:18:12.863 "driver_specific": { 00:18:12.863 "passthru": { 00:18:12.863 "name": "pt2", 00:18:12.863 "base_bdev_name": "malloc2" 00:18:12.863 } 00:18:12.863 } 00:18:12.863 }' 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:12.863 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:13.121 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:13.379 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:13.379 "name": "pt3", 00:18:13.379 "aliases": [ 00:18:13.379 "00000000-0000-0000-0000-000000000003" 00:18:13.379 ], 00:18:13.379 "product_name": "passthru", 00:18:13.379 "block_size": 512, 00:18:13.379 "num_blocks": 65536, 00:18:13.379 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:13.379 "assigned_rate_limits": { 00:18:13.379 "rw_ios_per_sec": 0, 00:18:13.379 "rw_mbytes_per_sec": 0, 00:18:13.379 "r_mbytes_per_sec": 0, 00:18:13.379 "w_mbytes_per_sec": 0 00:18:13.379 }, 00:18:13.379 "claimed": true, 00:18:13.379 "claim_type": "exclusive_write", 00:18:13.379 "zoned": false, 00:18:13.379 "supported_io_types": { 00:18:13.379 "read": true, 00:18:13.379 "write": true, 00:18:13.379 "unmap": true, 00:18:13.379 "flush": true, 00:18:13.379 "reset": true, 00:18:13.379 "nvme_admin": false, 00:18:13.379 "nvme_io": false, 00:18:13.379 "nvme_io_md": false, 00:18:13.379 "write_zeroes": true, 00:18:13.379 "zcopy": true, 00:18:13.379 "get_zone_info": false, 00:18:13.379 "zone_management": false, 00:18:13.379 "zone_append": false, 00:18:13.379 "compare": false, 00:18:13.379 "compare_and_write": false, 00:18:13.379 "abort": true, 00:18:13.379 "seek_hole": false, 00:18:13.379 "seek_data": false, 00:18:13.379 "copy": true, 00:18:13.379 "nvme_iov_md": false 00:18:13.379 }, 00:18:13.379 "memory_domains": [ 00:18:13.379 { 00:18:13.379 "dma_device_id": "system", 00:18:13.379 
"dma_device_type": 1 00:18:13.379 }, 00:18:13.379 { 00:18:13.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.379 "dma_device_type": 2 00:18:13.379 } 00:18:13.379 ], 00:18:13.379 "driver_specific": { 00:18:13.379 "passthru": { 00:18:13.379 "name": "pt3", 00:18:13.379 "base_bdev_name": "malloc3" 00:18:13.379 } 00:18:13.379 } 00:18:13.379 }' 00:18:13.379 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.379 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.379 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:13.379 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.637 13:36:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:13.637 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:13.637 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.637 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:13.637 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:13.894 13:36:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:13.894 "name": "pt4", 00:18:13.894 "aliases": [ 00:18:13.894 "00000000-0000-0000-0000-000000000004" 00:18:13.894 ], 00:18:13.894 "product_name": "passthru", 00:18:13.894 "block_size": 512, 00:18:13.894 "num_blocks": 65536, 00:18:13.894 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:13.894 "assigned_rate_limits": { 00:18:13.894 "rw_ios_per_sec": 0, 00:18:13.894 "rw_mbytes_per_sec": 0, 00:18:13.894 "r_mbytes_per_sec": 0, 00:18:13.894 "w_mbytes_per_sec": 0 00:18:13.894 }, 00:18:13.894 "claimed": true, 00:18:13.894 "claim_type": "exclusive_write", 00:18:13.894 "zoned": false, 00:18:13.894 "supported_io_types": { 00:18:13.894 "read": true, 00:18:13.894 "write": true, 00:18:13.894 "unmap": true, 00:18:13.894 "flush": true, 00:18:13.894 "reset": true, 00:18:13.894 "nvme_admin": false, 00:18:13.894 "nvme_io": false, 00:18:13.894 "nvme_io_md": false, 00:18:13.894 "write_zeroes": true, 00:18:13.894 "zcopy": true, 00:18:13.894 "get_zone_info": false, 00:18:13.894 "zone_management": false, 00:18:13.894 "zone_append": false, 00:18:13.894 "compare": false, 00:18:13.894 "compare_and_write": false, 00:18:13.894 "abort": true, 00:18:13.894 "seek_hole": false, 00:18:13.894 "seek_data": false, 00:18:13.894 "copy": true, 00:18:13.894 "nvme_iov_md": false 00:18:13.894 }, 00:18:13.894 "memory_domains": [ 00:18:13.894 { 00:18:13.894 "dma_device_id": "system", 00:18:13.894 "dma_device_type": 1 00:18:13.894 }, 00:18:13.894 { 00:18:13.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.894 "dma_device_type": 2 00:18:13.894 } 00:18:13.894 ], 00:18:13.894 "driver_specific": { 00:18:13.894 "passthru": { 00:18:13.894 "name": "pt4", 00:18:13.894 "base_bdev_name": "malloc4" 00:18:13.894 } 00:18:13.894 } 00:18:13.894 }' 00:18:13.894 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.894 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.151 13:36:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.151 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.409 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.409 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:14.409 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:14.667 [2024-07-15 13:36:53.842519] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:14.667 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7fc7bdef-e7eb-4756-8e91-744b785e0ec0 00:18:14.667 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7fc7bdef-e7eb-4756-8e91-744b785e0ec0 ']' 00:18:14.667 13:36:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:14.667 [2024-07-15 13:36:54.090869] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:14.667 
[2024-07-15 13:36:54.090892] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:14.667 [2024-07-15 13:36:54.090955] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:14.667 [2024-07-15 13:36:54.091021] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:14.667 [2024-07-15 13:36:54.091033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20da530 name raid_bdev1, state offline 00:18:14.925 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.925 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:15.182 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:15.439 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:15.439 13:36:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:15.697 13:36:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:15.697 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:15.954 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:15.954 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:16.212 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:16.469 [2024-07-15 13:36:55.743184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:16.469 [2024-07-15 13:36:55.744567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:16.469 [2024-07-15 13:36:55.744611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:16.469 [2024-07-15 13:36:55.744651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:16.469 [2024-07-15 13:36:55.744699] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:16.469 [2024-07-15 13:36:55.744737] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:16.469 [2024-07-15 13:36:55.744759] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:16.469 [2024-07-15 13:36:55.744782] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:16.469 
[2024-07-15 13:36:55.744800] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:16.469 [2024-07-15 13:36:55.744811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2285ff0 name raid_bdev1, state configuring 00:18:16.470 request: 00:18:16.470 { 00:18:16.470 "name": "raid_bdev1", 00:18:16.470 "raid_level": "raid0", 00:18:16.470 "base_bdevs": [ 00:18:16.470 "malloc1", 00:18:16.470 "malloc2", 00:18:16.470 "malloc3", 00:18:16.470 "malloc4" 00:18:16.470 ], 00:18:16.470 "strip_size_kb": 64, 00:18:16.470 "superblock": false, 00:18:16.470 "method": "bdev_raid_create", 00:18:16.470 "req_id": 1 00:18:16.470 } 00:18:16.470 Got JSON-RPC error response 00:18:16.470 response: 00:18:16.470 { 00:18:16.470 "code": -17, 00:18:16.470 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:16.470 } 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:16.470 13:36:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.727 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:16.727 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:16.727 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:18:16.985 [2024-07-15 13:36:56.232428] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:16.985 [2024-07-15 13:36:56.232474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:16.985 [2024-07-15 13:36:56.232496] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e27a0 00:18:16.985 [2024-07-15 13:36:56.232509] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:16.985 [2024-07-15 13:36:56.234148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:16.985 [2024-07-15 13:36:56.234176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:16.985 [2024-07-15 13:36:56.234243] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:16.985 [2024-07-15 13:36:56.234272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:16.985 pt1 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.985 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.243 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.243 "name": "raid_bdev1", 00:18:17.243 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:17.243 "strip_size_kb": 64, 00:18:17.243 "state": "configuring", 00:18:17.243 "raid_level": "raid0", 00:18:17.243 "superblock": true, 00:18:17.243 "num_base_bdevs": 4, 00:18:17.243 "num_base_bdevs_discovered": 1, 00:18:17.243 "num_base_bdevs_operational": 4, 00:18:17.243 "base_bdevs_list": [ 00:18:17.243 { 00:18:17.243 "name": "pt1", 00:18:17.243 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:17.243 "is_configured": true, 00:18:17.243 "data_offset": 2048, 00:18:17.243 "data_size": 63488 00:18:17.243 }, 00:18:17.243 { 00:18:17.243 "name": null, 00:18:17.243 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:17.243 "is_configured": false, 00:18:17.243 "data_offset": 2048, 00:18:17.243 "data_size": 63488 00:18:17.243 }, 00:18:17.243 { 00:18:17.243 "name": null, 00:18:17.243 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:17.243 "is_configured": false, 00:18:17.243 "data_offset": 2048, 00:18:17.243 "data_size": 63488 00:18:17.243 }, 00:18:17.243 { 00:18:17.243 "name": null, 00:18:17.243 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:17.243 "is_configured": false, 00:18:17.243 "data_offset": 2048, 00:18:17.243 "data_size": 63488 00:18:17.243 } 00:18:17.243 ] 00:18:17.243 }' 00:18:17.243 13:36:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.243 13:36:56 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.809 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:17.809 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:18.066 [2024-07-15 13:36:57.323331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:18.066 [2024-07-15 13:36:57.323378] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:18.066 [2024-07-15 13:36:57.323397] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227b940 00:18:18.066 [2024-07-15 13:36:57.323410] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.067 [2024-07-15 13:36:57.323741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.067 [2024-07-15 13:36:57.323758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:18.067 [2024-07-15 13:36:57.323814] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:18.067 [2024-07-15 13:36:57.323832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:18.067 pt2 00:18:18.067 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:18.325 [2024-07-15 13:36:57.499809] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:18.325 13:36:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.325 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:18.583 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.583 "name": "raid_bdev1", 00:18:18.583 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:18.583 "strip_size_kb": 64, 00:18:18.583 "state": "configuring", 00:18:18.583 "raid_level": "raid0", 00:18:18.583 "superblock": true, 00:18:18.583 "num_base_bdevs": 4, 00:18:18.583 "num_base_bdevs_discovered": 1, 00:18:18.583 "num_base_bdevs_operational": 4, 00:18:18.583 "base_bdevs_list": [ 00:18:18.583 { 00:18:18.583 "name": "pt1", 00:18:18.583 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:18.583 "is_configured": true, 00:18:18.583 "data_offset": 2048, 00:18:18.583 "data_size": 63488 00:18:18.583 }, 00:18:18.583 { 00:18:18.583 "name": null, 00:18:18.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:18.583 
"is_configured": false, 00:18:18.583 "data_offset": 2048, 00:18:18.583 "data_size": 63488 00:18:18.583 }, 00:18:18.583 { 00:18:18.583 "name": null, 00:18:18.583 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:18.583 "is_configured": false, 00:18:18.583 "data_offset": 2048, 00:18:18.583 "data_size": 63488 00:18:18.583 }, 00:18:18.583 { 00:18:18.583 "name": null, 00:18:18.583 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:18.583 "is_configured": false, 00:18:18.583 "data_offset": 2048, 00:18:18.583 "data_size": 63488 00:18:18.583 } 00:18:18.583 ] 00:18:18.583 }' 00:18:18.583 13:36:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.583 13:36:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.149 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:19.149 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:19.149 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:19.408 [2024-07-15 13:36:58.590715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:19.408 [2024-07-15 13:36:58.590763] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.408 [2024-07-15 13:36:58.590782] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9060 00:18:19.408 [2024-07-15 13:36:58.590795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.408 [2024-07-15 13:36:58.591142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.408 [2024-07-15 13:36:58.591159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:19.408 [2024-07-15 13:36:58.591219] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:19.408 [2024-07-15 13:36:58.591238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:19.408 pt2 00:18:19.408 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:19.408 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:19.408 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:19.667 [2024-07-15 13:36:58.835373] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:19.667 [2024-07-15 13:36:58.835411] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.667 [2024-07-15 13:36:58.835431] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20db8d0 00:18:19.667 [2024-07-15 13:36:58.835443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.667 [2024-07-15 13:36:58.835745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.667 [2024-07-15 13:36:58.835762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:19.667 [2024-07-15 13:36:58.835816] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:19.667 [2024-07-15 13:36:58.835834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:19.667 pt3 00:18:19.667 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:19.667 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:19.667 13:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:19.667 [2024-07-15 13:36:59.084035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:19.667 [2024-07-15 13:36:59.084076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.667 [2024-07-15 13:36:59.084092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20dcb80 00:18:19.667 [2024-07-15 13:36:59.084105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.667 [2024-07-15 13:36:59.084384] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.667 [2024-07-15 13:36:59.084402] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:19.667 [2024-07-15 13:36:59.084451] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:19.667 [2024-07-15 13:36:59.084468] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:19.667 [2024-07-15 13:36:59.084583] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20d9780 00:18:19.667 [2024-07-15 13:36:59.084593] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:19.667 [2024-07-15 13:36:59.084758] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ded70 00:18:19.667 [2024-07-15 13:36:59.084884] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20d9780 00:18:19.667 [2024-07-15 13:36:59.084893] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20d9780 00:18:19.667 [2024-07-15 13:36:59.084999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:19.667 pt4 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.926 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.184 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.184 "name": "raid_bdev1", 00:18:20.184 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:20.184 "strip_size_kb": 64, 00:18:20.184 "state": "online", 00:18:20.184 "raid_level": "raid0", 00:18:20.184 "superblock": true, 00:18:20.184 "num_base_bdevs": 4, 00:18:20.184 "num_base_bdevs_discovered": 4, 00:18:20.184 "num_base_bdevs_operational": 4, 
00:18:20.184 "base_bdevs_list": [ 00:18:20.184 { 00:18:20.184 "name": "pt1", 00:18:20.184 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:20.184 "is_configured": true, 00:18:20.184 "data_offset": 2048, 00:18:20.184 "data_size": 63488 00:18:20.184 }, 00:18:20.184 { 00:18:20.184 "name": "pt2", 00:18:20.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:20.184 "is_configured": true, 00:18:20.184 "data_offset": 2048, 00:18:20.184 "data_size": 63488 00:18:20.184 }, 00:18:20.184 { 00:18:20.184 "name": "pt3", 00:18:20.184 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:20.184 "is_configured": true, 00:18:20.184 "data_offset": 2048, 00:18:20.184 "data_size": 63488 00:18:20.184 }, 00:18:20.184 { 00:18:20.184 "name": "pt4", 00:18:20.184 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:20.184 "is_configured": true, 00:18:20.184 "data_offset": 2048, 00:18:20.184 "data_size": 63488 00:18:20.184 } 00:18:20.184 ] 00:18:20.184 }' 00:18:20.184 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.184 13:36:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:18:20.749 13:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:21.007 [2024-07-15 13:37:00.183313] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:21.007 "name": "raid_bdev1", 00:18:21.007 "aliases": [ 00:18:21.007 "7fc7bdef-e7eb-4756-8e91-744b785e0ec0" 00:18:21.007 ], 00:18:21.007 "product_name": "Raid Volume", 00:18:21.007 "block_size": 512, 00:18:21.007 "num_blocks": 253952, 00:18:21.007 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:21.007 "assigned_rate_limits": { 00:18:21.007 "rw_ios_per_sec": 0, 00:18:21.007 "rw_mbytes_per_sec": 0, 00:18:21.007 "r_mbytes_per_sec": 0, 00:18:21.007 "w_mbytes_per_sec": 0 00:18:21.007 }, 00:18:21.007 "claimed": false, 00:18:21.007 "zoned": false, 00:18:21.007 "supported_io_types": { 00:18:21.007 "read": true, 00:18:21.007 "write": true, 00:18:21.007 "unmap": true, 00:18:21.007 "flush": true, 00:18:21.007 "reset": true, 00:18:21.007 "nvme_admin": false, 00:18:21.007 "nvme_io": false, 00:18:21.007 "nvme_io_md": false, 00:18:21.007 "write_zeroes": true, 00:18:21.007 "zcopy": false, 00:18:21.007 "get_zone_info": false, 00:18:21.007 "zone_management": false, 00:18:21.007 "zone_append": false, 00:18:21.007 "compare": false, 00:18:21.007 "compare_and_write": false, 00:18:21.007 "abort": false, 00:18:21.007 "seek_hole": false, 00:18:21.007 "seek_data": false, 00:18:21.007 "copy": false, 00:18:21.007 "nvme_iov_md": false 00:18:21.007 }, 00:18:21.007 "memory_domains": [ 00:18:21.007 { 00:18:21.007 "dma_device_id": "system", 00:18:21.007 "dma_device_type": 1 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.007 "dma_device_type": 2 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "system", 00:18:21.007 "dma_device_type": 1 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:18:21.007 "dma_device_type": 2 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "system", 00:18:21.007 "dma_device_type": 1 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.007 "dma_device_type": 2 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "system", 00:18:21.007 "dma_device_type": 1 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.007 "dma_device_type": 2 00:18:21.007 } 00:18:21.007 ], 00:18:21.007 "driver_specific": { 00:18:21.007 "raid": { 00:18:21.007 "uuid": "7fc7bdef-e7eb-4756-8e91-744b785e0ec0", 00:18:21.007 "strip_size_kb": 64, 00:18:21.007 "state": "online", 00:18:21.007 "raid_level": "raid0", 00:18:21.007 "superblock": true, 00:18:21.007 "num_base_bdevs": 4, 00:18:21.007 "num_base_bdevs_discovered": 4, 00:18:21.007 "num_base_bdevs_operational": 4, 00:18:21.007 "base_bdevs_list": [ 00:18:21.007 { 00:18:21.007 "name": "pt1", 00:18:21.007 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:21.007 "is_configured": true, 00:18:21.007 "data_offset": 2048, 00:18:21.007 "data_size": 63488 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "name": "pt2", 00:18:21.007 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.007 "is_configured": true, 00:18:21.007 "data_offset": 2048, 00:18:21.007 "data_size": 63488 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "name": "pt3", 00:18:21.007 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:21.007 "is_configured": true, 00:18:21.007 "data_offset": 2048, 00:18:21.007 "data_size": 63488 00:18:21.007 }, 00:18:21.007 { 00:18:21.007 "name": "pt4", 00:18:21.007 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:21.007 "is_configured": true, 00:18:21.007 "data_offset": 2048, 00:18:21.007 "data_size": 63488 00:18:21.007 } 00:18:21.007 ] 00:18:21.007 } 00:18:21.007 } 00:18:21.007 }' 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:21.007 pt2 00:18:21.007 pt3 00:18:21.007 pt4' 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:21.007 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.265 "name": "pt1", 00:18:21.265 "aliases": [ 00:18:21.265 "00000000-0000-0000-0000-000000000001" 00:18:21.265 ], 00:18:21.265 "product_name": "passthru", 00:18:21.265 "block_size": 512, 00:18:21.265 "num_blocks": 65536, 00:18:21.265 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:21.265 "assigned_rate_limits": { 00:18:21.265 "rw_ios_per_sec": 0, 00:18:21.265 "rw_mbytes_per_sec": 0, 00:18:21.265 "r_mbytes_per_sec": 0, 00:18:21.265 "w_mbytes_per_sec": 0 00:18:21.265 }, 00:18:21.265 "claimed": true, 00:18:21.265 "claim_type": "exclusive_write", 00:18:21.265 "zoned": false, 00:18:21.265 "supported_io_types": { 00:18:21.265 "read": true, 00:18:21.265 "write": true, 00:18:21.265 "unmap": true, 00:18:21.265 "flush": true, 00:18:21.265 "reset": true, 00:18:21.265 "nvme_admin": false, 00:18:21.265 "nvme_io": false, 00:18:21.265 "nvme_io_md": false, 00:18:21.265 "write_zeroes": true, 00:18:21.265 "zcopy": true, 00:18:21.265 "get_zone_info": false, 00:18:21.265 "zone_management": false, 00:18:21.265 "zone_append": false, 00:18:21.265 "compare": false, 00:18:21.265 "compare_and_write": false, 00:18:21.265 "abort": true, 00:18:21.265 "seek_hole": false, 00:18:21.265 "seek_data": false, 00:18:21.265 "copy": true, 00:18:21.265 "nvme_iov_md": 
false 00:18:21.265 }, 00:18:21.265 "memory_domains": [ 00:18:21.265 { 00:18:21.265 "dma_device_id": "system", 00:18:21.265 "dma_device_type": 1 00:18:21.265 }, 00:18:21.265 { 00:18:21.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.265 "dma_device_type": 2 00:18:21.265 } 00:18:21.265 ], 00:18:21.265 "driver_specific": { 00:18:21.265 "passthru": { 00:18:21.265 "name": "pt1", 00:18:21.265 "base_bdev_name": "malloc1" 00:18:21.265 } 00:18:21.265 } 00:18:21.265 }' 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:21.265 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.524 13:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:21.524 13:37:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.782 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.782 "name": "pt2", 00:18:21.782 "aliases": [ 00:18:21.782 "00000000-0000-0000-0000-000000000002" 00:18:21.782 ], 00:18:21.782 "product_name": "passthru", 00:18:21.782 "block_size": 512, 00:18:21.782 "num_blocks": 65536, 00:18:21.782 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:21.782 "assigned_rate_limits": { 00:18:21.782 "rw_ios_per_sec": 0, 00:18:21.782 "rw_mbytes_per_sec": 0, 00:18:21.782 "r_mbytes_per_sec": 0, 00:18:21.782 "w_mbytes_per_sec": 0 00:18:21.782 }, 00:18:21.782 "claimed": true, 00:18:21.782 "claim_type": "exclusive_write", 00:18:21.782 "zoned": false, 00:18:21.782 "supported_io_types": { 00:18:21.782 "read": true, 00:18:21.782 "write": true, 00:18:21.782 "unmap": true, 00:18:21.782 "flush": true, 00:18:21.782 "reset": true, 00:18:21.782 "nvme_admin": false, 00:18:21.782 "nvme_io": false, 00:18:21.782 "nvme_io_md": false, 00:18:21.782 "write_zeroes": true, 00:18:21.782 "zcopy": true, 00:18:21.782 "get_zone_info": false, 00:18:21.782 "zone_management": false, 00:18:21.782 "zone_append": false, 00:18:21.782 "compare": false, 00:18:21.782 "compare_and_write": false, 00:18:21.782 "abort": true, 00:18:21.782 "seek_hole": false, 00:18:21.782 "seek_data": false, 00:18:21.782 "copy": true, 00:18:21.782 "nvme_iov_md": false 00:18:21.782 }, 00:18:21.782 "memory_domains": [ 00:18:21.782 { 00:18:21.782 "dma_device_id": "system", 00:18:21.782 "dma_device_type": 1 00:18:21.782 }, 00:18:21.782 { 00:18:21.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.782 "dma_device_type": 2 00:18:21.782 } 00:18:21.782 ], 00:18:21.782 "driver_specific": { 00:18:21.782 "passthru": { 00:18:21.782 "name": "pt2", 00:18:21.782 "base_bdev_name": "malloc2" 00:18:21.782 } 00:18:21.782 } 00:18:21.782 }' 00:18:21.782 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:18:21.782 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.782 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.782 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.040 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.298 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.298 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.298 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:22.298 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.556 "name": "pt3", 00:18:22.556 "aliases": [ 00:18:22.556 "00000000-0000-0000-0000-000000000003" 00:18:22.556 ], 00:18:22.556 "product_name": "passthru", 00:18:22.556 "block_size": 512, 00:18:22.556 "num_blocks": 65536, 00:18:22.556 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:22.556 "assigned_rate_limits": { 00:18:22.556 "rw_ios_per_sec": 0, 00:18:22.556 "rw_mbytes_per_sec": 0, 
00:18:22.556 "r_mbytes_per_sec": 0, 00:18:22.556 "w_mbytes_per_sec": 0 00:18:22.556 }, 00:18:22.556 "claimed": true, 00:18:22.556 "claim_type": "exclusive_write", 00:18:22.556 "zoned": false, 00:18:22.556 "supported_io_types": { 00:18:22.556 "read": true, 00:18:22.556 "write": true, 00:18:22.556 "unmap": true, 00:18:22.556 "flush": true, 00:18:22.556 "reset": true, 00:18:22.556 "nvme_admin": false, 00:18:22.556 "nvme_io": false, 00:18:22.556 "nvme_io_md": false, 00:18:22.556 "write_zeroes": true, 00:18:22.556 "zcopy": true, 00:18:22.556 "get_zone_info": false, 00:18:22.556 "zone_management": false, 00:18:22.556 "zone_append": false, 00:18:22.556 "compare": false, 00:18:22.556 "compare_and_write": false, 00:18:22.556 "abort": true, 00:18:22.556 "seek_hole": false, 00:18:22.556 "seek_data": false, 00:18:22.556 "copy": true, 00:18:22.556 "nvme_iov_md": false 00:18:22.556 }, 00:18:22.556 "memory_domains": [ 00:18:22.556 { 00:18:22.556 "dma_device_id": "system", 00:18:22.556 "dma_device_type": 1 00:18:22.556 }, 00:18:22.556 { 00:18:22.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.556 "dma_device_type": 2 00:18:22.556 } 00:18:22.556 ], 00:18:22.556 "driver_specific": { 00:18:22.556 "passthru": { 00:18:22.556 "name": "pt3", 00:18:22.556 "base_bdev_name": "malloc3" 00:18:22.556 } 00:18:22.556 } 00:18:22.556 }' 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:18:22.556 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.814 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.814 13:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.814 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.814 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.814 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.814 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:22.814 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.071 "name": "pt4", 00:18:23.071 "aliases": [ 00:18:23.071 "00000000-0000-0000-0000-000000000004" 00:18:23.071 ], 00:18:23.071 "product_name": "passthru", 00:18:23.071 "block_size": 512, 00:18:23.071 "num_blocks": 65536, 00:18:23.071 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:23.071 "assigned_rate_limits": { 00:18:23.071 "rw_ios_per_sec": 0, 00:18:23.071 "rw_mbytes_per_sec": 0, 00:18:23.071 "r_mbytes_per_sec": 0, 00:18:23.071 "w_mbytes_per_sec": 0 00:18:23.071 }, 00:18:23.071 "claimed": true, 00:18:23.071 "claim_type": "exclusive_write", 00:18:23.071 "zoned": false, 00:18:23.071 "supported_io_types": { 00:18:23.071 "read": true, 00:18:23.071 "write": true, 00:18:23.071 "unmap": true, 00:18:23.071 "flush": true, 00:18:23.071 "reset": true, 00:18:23.071 "nvme_admin": false, 00:18:23.071 "nvme_io": false, 00:18:23.071 "nvme_io_md": false, 00:18:23.071 "write_zeroes": true, 00:18:23.071 "zcopy": true, 00:18:23.071 "get_zone_info": false, 00:18:23.071 
"zone_management": false, 00:18:23.071 "zone_append": false, 00:18:23.071 "compare": false, 00:18:23.071 "compare_and_write": false, 00:18:23.071 "abort": true, 00:18:23.071 "seek_hole": false, 00:18:23.071 "seek_data": false, 00:18:23.071 "copy": true, 00:18:23.071 "nvme_iov_md": false 00:18:23.071 }, 00:18:23.071 "memory_domains": [ 00:18:23.071 { 00:18:23.071 "dma_device_id": "system", 00:18:23.071 "dma_device_type": 1 00:18:23.071 }, 00:18:23.071 { 00:18:23.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.071 "dma_device_type": 2 00:18:23.071 } 00:18:23.071 ], 00:18:23.071 "driver_specific": { 00:18:23.071 "passthru": { 00:18:23.071 "name": "pt4", 00:18:23.071 "base_bdev_name": "malloc4" 00:18:23.071 } 00:18:23.071 } 00:18:23.071 }' 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.071 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:23.330 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:23.588 [2024-07-15 13:37:02.842328] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7fc7bdef-e7eb-4756-8e91-744b785e0ec0 '!=' 7fc7bdef-e7eb-4756-8e91-744b785e0ec0 ']' 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2138107 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2138107 ']' 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2138107 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2138107 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2138107' 00:18:23.588 killing process with pid 2138107 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2138107 
00:18:23.588 [2024-07-15 13:37:02.910406] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:23.588 13:37:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2138107 00:18:23.588 [2024-07-15 13:37:02.910474] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:23.588 [2024-07-15 13:37:02.910537] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:23.588 [2024-07-15 13:37:02.910548] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d9780 name raid_bdev1, state offline 00:18:23.588 [2024-07-15 13:37:02.950839] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:23.846 13:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:23.846 00:18:23.846 real 0m16.241s 00:18:23.846 user 0m29.340s 00:18:23.846 sys 0m2.885s 00:18:23.846 13:37:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:23.846 13:37:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.846 ************************************ 00:18:23.846 END TEST raid_superblock_test 00:18:23.846 ************************************ 00:18:23.846 13:37:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:23.846 13:37:03 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:23.846 13:37:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:23.846 13:37:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:23.846 13:37:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:23.846 ************************************ 00:18:23.846 START TEST raid_read_error_test 00:18:23.846 ************************************ 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:18:23.846 
13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:23.846 13:37:03 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Adm1hrwqTO 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2140547 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2140547 /var/tmp/spdk-raid.sock 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2140547 ']' 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:23.846 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.846 13:37:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:24.103 [2024-07-15 13:37:03.314623] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:18:24.103 [2024-07-15 13:37:03.314688] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2140547 ] 00:18:24.103 [2024-07-15 13:37:03.440913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:24.360 [2024-07-15 13:37:03.544259] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:24.360 [2024-07-15 13:37:03.605565] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:24.360 [2024-07-15 13:37:03.605603] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:25.292 13:37:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:25.292 13:37:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:25.292 13:37:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:25.292 13:37:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:25.550 BaseBdev1_malloc 00:18:25.550 13:37:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:26.116 true 00:18:26.116 13:37:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:26.394 [2024-07-15 13:37:05.756861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:26.394 [2024-07-15 13:37:05.756914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:26.394 [2024-07-15 13:37:05.756946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c4c0d0 00:18:26.394 [2024-07-15 13:37:05.756960] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:26.394 [2024-07-15 13:37:05.758834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:26.394 [2024-07-15 13:37:05.758862] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:26.394 BaseBdev1 00:18:26.394 13:37:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:26.394 13:37:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:26.665 BaseBdev2_malloc 00:18:26.665 13:37:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:27.230 true 00:18:27.230 13:37:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:27.487 [2024-07-15 13:37:06.777356] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:27.487 [2024-07-15 13:37:06.777401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.487 [2024-07-15 13:37:06.777424] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c50910 00:18:27.487 [2024-07-15 13:37:06.777437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.487 [2024-07-15 13:37:06.779036] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.487 [2024-07-15 13:37:06.779063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:27.487 BaseBdev2 00:18:27.487 13:37:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:27.487 13:37:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:27.743 BaseBdev3_malloc 00:18:27.743 13:37:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:28.001 true 00:18:28.001 13:37:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:28.259 [2024-07-15 13:37:07.503838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:28.259 [2024-07-15 13:37:07.503882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.259 [2024-07-15 13:37:07.503905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c52bd0 00:18:28.259 [2024-07-15 13:37:07.503918] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.259 [2024-07-15 13:37:07.505479] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.259 [2024-07-15 13:37:07.505506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:28.259 BaseBdev3 00:18:28.259 13:37:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:28.259 13:37:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:28.516 BaseBdev4_malloc 00:18:28.516 13:37:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:28.774 true 00:18:28.774 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:29.339 [2024-07-15 13:37:08.484220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:29.339 [2024-07-15 13:37:08.484265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:29.339 [2024-07-15 13:37:08.484288] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c53aa0 00:18:29.339 [2024-07-15 13:37:08.484301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:29.339 [2024-07-15 13:37:08.485902] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:29.339 [2024-07-15 13:37:08.485938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:29.339 BaseBdev4 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:29.339 [2024-07-15 13:37:08.740940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:29.339 [2024-07-15 13:37:08.742321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:29.339 [2024-07-15 13:37:08.742392] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:29.339 [2024-07-15 13:37:08.742453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:29.339 [2024-07-15 13:37:08.742698] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c4dc20 00:18:29.339 [2024-07-15 13:37:08.742709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:29.339 [2024-07-15 13:37:08.742909] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aa2260 00:18:29.339 [2024-07-15 13:37:08.743071] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c4dc20 00:18:29.339 [2024-07-15 13:37:08.743082] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c4dc20 00:18:29.339 [2024-07-15 13:37:08.743188] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.339 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.596 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.596 13:37:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.596 13:37:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.596 "name": "raid_bdev1", 00:18:29.596 "uuid": "e85913a9-9b7b-41d5-9c44-5a5932f5a14a", 00:18:29.596 "strip_size_kb": 64, 00:18:29.596 "state": "online", 00:18:29.596 "raid_level": "raid0", 00:18:29.596 "superblock": true, 00:18:29.596 "num_base_bdevs": 4, 00:18:29.596 "num_base_bdevs_discovered": 4, 00:18:29.596 "num_base_bdevs_operational": 4, 00:18:29.596 "base_bdevs_list": [ 00:18:29.596 { 00:18:29.596 "name": "BaseBdev1", 00:18:29.596 "uuid": "c199c4fc-6a38-5b3c-9109-5414098b8888", 00:18:29.596 "is_configured": true, 00:18:29.596 "data_offset": 2048, 00:18:29.596 "data_size": 63488 00:18:29.596 }, 00:18:29.596 { 00:18:29.596 "name": "BaseBdev2", 00:18:29.596 "uuid": "628a23ad-845d-5a96-97a0-f374d601d27a", 00:18:29.596 "is_configured": true, 00:18:29.596 "data_offset": 2048, 00:18:29.596 "data_size": 63488 00:18:29.596 }, 00:18:29.596 { 00:18:29.596 "name": "BaseBdev3", 00:18:29.596 "uuid": "8c301fb0-6916-5548-8bfa-8cedf6b01517", 00:18:29.596 "is_configured": true, 00:18:29.596 "data_offset": 2048, 
00:18:29.596 "data_size": 63488 00:18:29.596 }, 00:18:29.596 { 00:18:29.596 "name": "BaseBdev4", 00:18:29.596 "uuid": "de64ae68-5b32-5447-a6e4-c2850c0b2d45", 00:18:29.596 "is_configured": true, 00:18:29.596 "data_offset": 2048, 00:18:29.596 "data_size": 63488 00:18:29.596 } 00:18:29.596 ] 00:18:29.596 }' 00:18:29.596 13:37:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.596 13:37:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.530 13:37:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:30.530 13:37:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:30.530 [2024-07-15 13:37:09.675681] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3ffc0 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:31.463 13:37:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.721 13:37:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.721 "name": "raid_bdev1", 00:18:31.721 "uuid": "e85913a9-9b7b-41d5-9c44-5a5932f5a14a", 00:18:31.721 "strip_size_kb": 64, 00:18:31.721 "state": "online", 00:18:31.721 "raid_level": "raid0", 00:18:31.721 "superblock": true, 00:18:31.721 "num_base_bdevs": 4, 00:18:31.721 "num_base_bdevs_discovered": 4, 00:18:31.721 "num_base_bdevs_operational": 4, 00:18:31.721 "base_bdevs_list": [ 00:18:31.721 { 00:18:31.721 "name": "BaseBdev1", 00:18:31.721 "uuid": "c199c4fc-6a38-5b3c-9109-5414098b8888", 00:18:31.721 "is_configured": true, 00:18:31.721 "data_offset": 2048, 00:18:31.721 "data_size": 63488 00:18:31.721 }, 00:18:31.721 { 00:18:31.721 "name": "BaseBdev2", 00:18:31.721 "uuid": "628a23ad-845d-5a96-97a0-f374d601d27a", 00:18:31.721 "is_configured": true, 00:18:31.721 "data_offset": 2048, 00:18:31.721 "data_size": 63488 00:18:31.721 }, 00:18:31.721 { 00:18:31.721 "name": "BaseBdev3", 00:18:31.721 "uuid": "8c301fb0-6916-5548-8bfa-8cedf6b01517", 00:18:31.721 "is_configured": 
true, 00:18:31.721 "data_offset": 2048, 00:18:31.721 "data_size": 63488 00:18:31.721 }, 00:18:31.721 { 00:18:31.721 "name": "BaseBdev4", 00:18:31.721 "uuid": "de64ae68-5b32-5447-a6e4-c2850c0b2d45", 00:18:31.721 "is_configured": true, 00:18:31.721 "data_offset": 2048, 00:18:31.721 "data_size": 63488 00:18:31.721 } 00:18:31.721 ] 00:18:31.721 }' 00:18:31.721 13:37:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.721 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.289 13:37:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:32.549 [2024-07-15 13:37:11.796525] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:32.549 [2024-07-15 13:37:11.796565] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:32.549 [2024-07-15 13:37:11.799712] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:32.549 [2024-07-15 13:37:11.799749] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.549 [2024-07-15 13:37:11.799790] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:32.549 [2024-07-15 13:37:11.799801] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c4dc20 name raid_bdev1, state offline 00:18:32.549 0 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2140547 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2140547 ']' 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2140547 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:32.549 13:37:11 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2140547 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2140547' 00:18:32.549 killing process with pid 2140547 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2140547 00:18:32.549 [2024-07-15 13:37:11.862149] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:32.549 13:37:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2140547 00:18:32.549 [2024-07-15 13:37:11.893121] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Adm1hrwqTO 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:18:32.809 00:18:32.809 real 0m8.874s 00:18:32.809 user 0m14.559s 00:18:32.809 sys 0m1.431s 00:18:32.809 13:37:12 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:32.809 13:37:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.809 ************************************ 00:18:32.809 END TEST raid_read_error_test 00:18:32.809 ************************************ 00:18:32.809 13:37:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:32.809 13:37:12 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:18:32.809 13:37:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:32.809 13:37:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:32.809 13:37:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:32.809 ************************************ 00:18:32.809 START TEST raid_write_error_test 00:18:32.809 ************************************ 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:32.809 
13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:32.809 
13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ekIurfcIOU 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2141788 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2141788 /var/tmp/spdk-raid.sock 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2141788 ']' 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:32.809 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:32.809 13:37:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.068 [2024-07-15 13:37:12.254656] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:18:33.068 [2024-07-15 13:37:12.254706] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2141788 ] 00:18:33.068 [2024-07-15 13:37:12.367686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:33.068 [2024-07-15 13:37:12.471401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:33.328 [2024-07-15 13:37:12.532126] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:33.328 [2024-07-15 13:37:12.532183] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:33.896 13:37:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:33.896 13:37:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:33.896 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:33.896 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:34.154 BaseBdev1_malloc 00:18:34.154 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:34.413 true 00:18:34.413 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:34.671 [2024-07-15 13:37:13.905434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:34.671 [2024-07-15 13:37:13.905478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:18:34.671 [2024-07-15 13:37:13.905500] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e80d0 00:18:34.671 [2024-07-15 13:37:13.905513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:34.671 [2024-07-15 13:37:13.907400] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:34.671 [2024-07-15 13:37:13.907429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:34.671 BaseBdev1 00:18:34.671 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:34.671 13:37:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:34.931 BaseBdev2_malloc 00:18:34.931 13:37:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:35.189 true 00:18:35.189 13:37:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:35.448 [2024-07-15 13:37:14.643976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:35.448 [2024-07-15 13:37:14.644019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.448 [2024-07-15 13:37:14.644040] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ec910 00:18:35.448 [2024-07-15 13:37:14.644052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.448 [2024-07-15 13:37:14.645597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.448 [2024-07-15 13:37:14.645624] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:35.448 BaseBdev2 00:18:35.448 13:37:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:35.448 13:37:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:35.706 BaseBdev3_malloc 00:18:35.706 13:37:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:35.706 true 00:18:35.965 13:37:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:35.965 [2024-07-15 13:37:15.363708] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:35.965 [2024-07-15 13:37:15.363755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.965 [2024-07-15 13:37:15.363776] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26eebd0 00:18:35.965 [2024-07-15 13:37:15.363788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.965 [2024-07-15 13:37:15.365399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.965 [2024-07-15 13:37:15.365428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:35.965 BaseBdev3 00:18:35.965 13:37:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:35.965 13:37:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:36.224 BaseBdev4_malloc 00:18:36.224 13:37:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:36.492 true 00:18:36.492 13:37:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:36.750 [2024-07-15 13:37:16.079412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:36.750 [2024-07-15 13:37:16.079455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.750 [2024-07-15 13:37:16.079474] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26efaa0 00:18:36.750 [2024-07-15 13:37:16.079487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.750 [2024-07-15 13:37:16.081024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.751 [2024-07-15 13:37:16.081051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:36.751 BaseBdev4 00:18:36.751 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:37.009 [2024-07-15 13:37:16.324095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.009 [2024-07-15 13:37:16.325437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.009 [2024-07-15 13:37:16.325505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:37.009 [2024-07-15 13:37:16.325566] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:37.009 [2024-07-15 13:37:16.325799] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26e9c20 00:18:37.009 [2024-07-15 13:37:16.325811] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:37.009 [2024-07-15 13:37:16.326020] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253e260 00:18:37.009 [2024-07-15 13:37:16.326176] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26e9c20 00:18:37.009 [2024-07-15 13:37:16.326186] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26e9c20 00:18:37.009 [2024-07-15 13:37:16.326290] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.009 13:37:16 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.009 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.267 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.267 "name": "raid_bdev1", 00:18:37.267 "uuid": "fbb03003-6fe3-4a06-a3a9-b820deacf5ee", 00:18:37.267 "strip_size_kb": 64, 00:18:37.267 "state": "online", 00:18:37.267 "raid_level": "raid0", 00:18:37.267 "superblock": true, 00:18:37.267 "num_base_bdevs": 4, 00:18:37.267 "num_base_bdevs_discovered": 4, 00:18:37.267 "num_base_bdevs_operational": 4, 00:18:37.267 "base_bdevs_list": [ 00:18:37.267 { 00:18:37.267 "name": "BaseBdev1", 00:18:37.267 "uuid": "7880cd26-e10d-5ba5-a529-b6726c9db458", 00:18:37.267 "is_configured": true, 00:18:37.267 "data_offset": 2048, 00:18:37.267 "data_size": 63488 00:18:37.267 }, 00:18:37.267 { 00:18:37.267 "name": "BaseBdev2", 00:18:37.267 "uuid": "002377b4-6692-516a-bf74-3af1a6f56b19", 00:18:37.267 "is_configured": true, 00:18:37.267 "data_offset": 2048, 00:18:37.267 "data_size": 63488 00:18:37.267 }, 00:18:37.267 { 00:18:37.267 "name": "BaseBdev3", 00:18:37.267 "uuid": "222a999d-cbd9-5b06-b67a-cd8839ec6e38", 00:18:37.267 "is_configured": true, 00:18:37.267 "data_offset": 2048, 00:18:37.267 "data_size": 63488 00:18:37.267 }, 00:18:37.267 { 00:18:37.267 "name": "BaseBdev4", 00:18:37.267 "uuid": "ca643441-3b03-5c07-a294-8838d4db046a", 00:18:37.267 "is_configured": true, 00:18:37.267 "data_offset": 2048, 00:18:37.267 "data_size": 63488 00:18:37.267 } 00:18:37.267 ] 00:18:37.267 }' 00:18:37.267 13:37:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.267 13:37:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.834 13:37:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:18:37.834 13:37:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:38.092 [2024-07-15 13:37:17.290937] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26dbfc0 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.029 13:37:18 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.029 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.288 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.288 "name": "raid_bdev1", 00:18:39.288 "uuid": "fbb03003-6fe3-4a06-a3a9-b820deacf5ee", 00:18:39.288 "strip_size_kb": 64, 00:18:39.288 "state": "online", 00:18:39.288 "raid_level": "raid0", 00:18:39.288 "superblock": true, 00:18:39.288 "num_base_bdevs": 4, 00:18:39.288 "num_base_bdevs_discovered": 4, 00:18:39.288 "num_base_bdevs_operational": 4, 00:18:39.288 "base_bdevs_list": [ 00:18:39.288 { 00:18:39.288 "name": "BaseBdev1", 00:18:39.288 "uuid": "7880cd26-e10d-5ba5-a529-b6726c9db458", 00:18:39.288 "is_configured": true, 00:18:39.288 "data_offset": 2048, 00:18:39.288 "data_size": 63488 00:18:39.288 }, 00:18:39.288 { 00:18:39.288 "name": "BaseBdev2", 00:18:39.288 "uuid": "002377b4-6692-516a-bf74-3af1a6f56b19", 00:18:39.288 "is_configured": true, 00:18:39.288 "data_offset": 2048, 00:18:39.288 "data_size": 63488 00:18:39.288 }, 00:18:39.288 { 00:18:39.288 "name": "BaseBdev3", 00:18:39.288 "uuid": "222a999d-cbd9-5b06-b67a-cd8839ec6e38", 00:18:39.288 "is_configured": true, 00:18:39.288 "data_offset": 2048, 00:18:39.288 "data_size": 63488 00:18:39.288 }, 00:18:39.288 { 00:18:39.288 "name": "BaseBdev4", 00:18:39.288 "uuid": "ca643441-3b03-5c07-a294-8838d4db046a", 00:18:39.288 "is_configured": true, 00:18:39.288 "data_offset": 2048, 00:18:39.288 "data_size": 63488 00:18:39.288 } 00:18:39.288 ] 00:18:39.288 }' 00:18:39.288 13:37:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.288 13:37:18 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:39.855 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:40.114 [2024-07-15 13:37:19.488380] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:40.114 [2024-07-15 13:37:19.488412] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.114 [2024-07-15 13:37:19.491644] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:40.114 [2024-07-15 13:37:19.491685] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.114 [2024-07-15 13:37:19.491727] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:40.114 [2024-07-15 13:37:19.491738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e9c20 name raid_bdev1, state offline 00:18:40.114 0 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2141788 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2141788 ']' 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2141788 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:40.114 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2141788 00:18:40.441 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:40.441 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:40.441 13:37:19 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2141788' 00:18:40.441 killing process with pid 2141788 00:18:40.441 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2141788 00:18:40.441 [2024-07-15 13:37:19.558821] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:40.441 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2141788 00:18:40.441 [2024-07-15 13:37:19.594800] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:40.700 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ekIurfcIOU 00:18:40.700 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:40.700 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:40.700 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:18:40.700 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:40.701 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:40.701 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:40.701 13:37:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:18:40.701 00:18:40.701 real 0m7.640s 00:18:40.701 user 0m12.234s 00:18:40.701 sys 0m1.317s 00:18:40.701 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:40.701 13:37:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.701 ************************************ 00:18:40.701 END TEST raid_write_error_test 00:18:40.701 ************************************ 00:18:40.701 13:37:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:40.701 13:37:19 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:40.701 
13:37:19 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:18:40.701 13:37:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:40.701 13:37:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:40.701 13:37:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:40.701 ************************************ 00:18:40.701 START TEST raid_state_function_test 00:18:40.701 ************************************ 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2142853 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2142853' 00:18:40.701 Process raid pid: 2142853 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2142853 /var/tmp/spdk-raid.sock 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2142853 ']' 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:40.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:40.701 13:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.701 [2024-07-15 13:37:19.994696] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:18:40.701 [2024-07-15 13:37:19.994750] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:40.701 [2024-07-15 13:37:20.111262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:40.960 [2024-07-15 13:37:20.214185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:40.960 [2024-07-15 13:37:20.278232] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:40.960 [2024-07-15 13:37:20.278265] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.529 13:37:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:41.529 13:37:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:41.529 13:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:41.788 [2024-07-15 13:37:21.092842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:41.788 [2024-07-15 13:37:21.092885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:41.788 [2024-07-15 13:37:21.092895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:41.788 [2024-07-15 13:37:21.092907] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:41.788 [2024-07-15 13:37:21.092916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:41.788 [2024-07-15 13:37:21.092933] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:41.789 [2024-07-15 13:37:21.092942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:41.789 [2024-07-15 13:37:21.092953] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.789 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.048 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.048 "name": "Existed_Raid", 00:18:42.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.048 "strip_size_kb": 64, 
00:18:42.048 "state": "configuring", 00:18:42.048 "raid_level": "concat", 00:18:42.048 "superblock": false, 00:18:42.048 "num_base_bdevs": 4, 00:18:42.048 "num_base_bdevs_discovered": 0, 00:18:42.048 "num_base_bdevs_operational": 4, 00:18:42.048 "base_bdevs_list": [ 00:18:42.048 { 00:18:42.048 "name": "BaseBdev1", 00:18:42.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.048 "is_configured": false, 00:18:42.048 "data_offset": 0, 00:18:42.048 "data_size": 0 00:18:42.048 }, 00:18:42.048 { 00:18:42.048 "name": "BaseBdev2", 00:18:42.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.048 "is_configured": false, 00:18:42.048 "data_offset": 0, 00:18:42.048 "data_size": 0 00:18:42.048 }, 00:18:42.048 { 00:18:42.048 "name": "BaseBdev3", 00:18:42.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.048 "is_configured": false, 00:18:42.048 "data_offset": 0, 00:18:42.048 "data_size": 0 00:18:42.048 }, 00:18:42.048 { 00:18:42.048 "name": "BaseBdev4", 00:18:42.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.048 "is_configured": false, 00:18:42.048 "data_offset": 0, 00:18:42.048 "data_size": 0 00:18:42.048 } 00:18:42.048 ] 00:18:42.048 }' 00:18:42.048 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.048 13:37:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.616 13:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:42.875 [2024-07-15 13:37:22.195615] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:42.875 [2024-07-15 13:37:22.195644] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f46aa0 name Existed_Raid, state configuring 00:18:42.875 13:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:43.134 [2024-07-15 13:37:22.444304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:43.134 [2024-07-15 13:37:22.444329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:43.134 [2024-07-15 13:37:22.444338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:43.134 [2024-07-15 13:37:22.444350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:43.134 [2024-07-15 13:37:22.444358] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:43.134 [2024-07-15 13:37:22.444370] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:43.134 [2024-07-15 13:37:22.444379] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:43.134 [2024-07-15 13:37:22.444389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:43.134 13:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:43.393 [2024-07-15 13:37:22.698967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.393 BaseBdev1 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:43.393 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:43.652 13:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:43.913 [ 00:18:43.913 { 00:18:43.913 "name": "BaseBdev1", 00:18:43.913 "aliases": [ 00:18:43.913 "0681b900-6db2-458f-8533-2f0c8bf389c2" 00:18:43.913 ], 00:18:43.913 "product_name": "Malloc disk", 00:18:43.913 "block_size": 512, 00:18:43.913 "num_blocks": 65536, 00:18:43.913 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:43.913 "assigned_rate_limits": { 00:18:43.913 "rw_ios_per_sec": 0, 00:18:43.913 "rw_mbytes_per_sec": 0, 00:18:43.913 "r_mbytes_per_sec": 0, 00:18:43.913 "w_mbytes_per_sec": 0 00:18:43.913 }, 00:18:43.913 "claimed": true, 00:18:43.913 "claim_type": "exclusive_write", 00:18:43.913 "zoned": false, 00:18:43.913 "supported_io_types": { 00:18:43.913 "read": true, 00:18:43.913 "write": true, 00:18:43.913 "unmap": true, 00:18:43.913 "flush": true, 00:18:43.913 "reset": true, 00:18:43.913 "nvme_admin": false, 00:18:43.913 "nvme_io": false, 00:18:43.913 "nvme_io_md": false, 00:18:43.913 "write_zeroes": true, 00:18:43.913 "zcopy": true, 00:18:43.913 "get_zone_info": false, 00:18:43.913 "zone_management": false, 00:18:43.913 "zone_append": false, 00:18:43.913 "compare": false, 00:18:43.913 "compare_and_write": false, 00:18:43.913 "abort": true, 00:18:43.913 "seek_hole": false, 00:18:43.913 "seek_data": false, 00:18:43.913 "copy": true, 00:18:43.913 "nvme_iov_md": 
false 00:18:43.913 }, 00:18:43.913 "memory_domains": [ 00:18:43.913 { 00:18:43.913 "dma_device_id": "system", 00:18:43.913 "dma_device_type": 1 00:18:43.913 }, 00:18:43.913 { 00:18:43.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.913 "dma_device_type": 2 00:18:43.913 } 00:18:43.913 ], 00:18:43.913 "driver_specific": {} 00:18:43.913 } 00:18:43.913 ] 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.913 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.170 13:37:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.170 "name": "Existed_Raid", 00:18:44.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.170 "strip_size_kb": 64, 00:18:44.170 "state": "configuring", 00:18:44.170 "raid_level": "concat", 00:18:44.170 "superblock": false, 00:18:44.170 "num_base_bdevs": 4, 00:18:44.170 "num_base_bdevs_discovered": 1, 00:18:44.170 "num_base_bdevs_operational": 4, 00:18:44.170 "base_bdevs_list": [ 00:18:44.170 { 00:18:44.170 "name": "BaseBdev1", 00:18:44.170 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:44.170 "is_configured": true, 00:18:44.170 "data_offset": 0, 00:18:44.170 "data_size": 65536 00:18:44.170 }, 00:18:44.170 { 00:18:44.170 "name": "BaseBdev2", 00:18:44.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.170 "is_configured": false, 00:18:44.170 "data_offset": 0, 00:18:44.170 "data_size": 0 00:18:44.170 }, 00:18:44.170 { 00:18:44.170 "name": "BaseBdev3", 00:18:44.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.170 "is_configured": false, 00:18:44.170 "data_offset": 0, 00:18:44.170 "data_size": 0 00:18:44.170 }, 00:18:44.170 { 00:18:44.170 "name": "BaseBdev4", 00:18:44.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.170 "is_configured": false, 00:18:44.170 "data_offset": 0, 00:18:44.170 "data_size": 0 00:18:44.170 } 00:18:44.170 ] 00:18:44.170 }' 00:18:44.170 13:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.170 13:37:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.734 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:44.990 [2024-07-15 13:37:24.239057] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:44.990 [2024-07-15 13:37:24.239097] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f46310 name Existed_Raid, state configuring 00:18:44.990 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:45.248 [2024-07-15 13:37:24.483741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:45.248 [2024-07-15 13:37:24.485202] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:45.248 [2024-07-15 13:37:24.485235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:45.248 [2024-07-15 13:37:24.485246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:45.248 [2024-07-15 13:37:24.485257] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:45.248 [2024-07-15 13:37:24.485266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:45.248 [2024-07-15 13:37:24.485277] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.248 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.506 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.506 "name": "Existed_Raid", 00:18:45.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.506 "strip_size_kb": 64, 00:18:45.506 "state": "configuring", 00:18:45.506 "raid_level": "concat", 00:18:45.506 "superblock": false, 00:18:45.506 "num_base_bdevs": 4, 00:18:45.506 "num_base_bdevs_discovered": 1, 00:18:45.506 "num_base_bdevs_operational": 4, 00:18:45.506 "base_bdevs_list": [ 00:18:45.506 { 00:18:45.506 "name": "BaseBdev1", 00:18:45.506 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:45.506 "is_configured": true, 00:18:45.506 "data_offset": 0, 00:18:45.506 "data_size": 65536 00:18:45.506 }, 00:18:45.506 { 00:18:45.506 "name": "BaseBdev2", 00:18:45.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.506 "is_configured": false, 00:18:45.506 "data_offset": 0, 00:18:45.506 "data_size": 0 00:18:45.506 }, 00:18:45.506 { 00:18:45.506 "name": "BaseBdev3", 
00:18:45.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.506 "is_configured": false, 00:18:45.506 "data_offset": 0, 00:18:45.506 "data_size": 0 00:18:45.506 }, 00:18:45.506 { 00:18:45.506 "name": "BaseBdev4", 00:18:45.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.506 "is_configured": false, 00:18:45.506 "data_offset": 0, 00:18:45.506 "data_size": 0 00:18:45.506 } 00:18:45.506 ] 00:18:45.506 }' 00:18:45.506 13:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.506 13:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.071 13:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:46.329 [2024-07-15 13:37:25.602377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:46.329 BaseBdev2 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:46.329 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:46.587 13:37:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:46.844 [ 00:18:46.844 { 00:18:46.844 "name": "BaseBdev2", 00:18:46.844 "aliases": [ 00:18:46.844 "439b3ea0-a70a-4b34-b15e-208df14394f7" 00:18:46.844 ], 00:18:46.844 "product_name": "Malloc disk", 00:18:46.844 "block_size": 512, 00:18:46.844 "num_blocks": 65536, 00:18:46.844 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:46.844 "assigned_rate_limits": { 00:18:46.844 "rw_ios_per_sec": 0, 00:18:46.844 "rw_mbytes_per_sec": 0, 00:18:46.844 "r_mbytes_per_sec": 0, 00:18:46.844 "w_mbytes_per_sec": 0 00:18:46.844 }, 00:18:46.844 "claimed": true, 00:18:46.844 "claim_type": "exclusive_write", 00:18:46.844 "zoned": false, 00:18:46.844 "supported_io_types": { 00:18:46.844 "read": true, 00:18:46.844 "write": true, 00:18:46.844 "unmap": true, 00:18:46.844 "flush": true, 00:18:46.844 "reset": true, 00:18:46.844 "nvme_admin": false, 00:18:46.844 "nvme_io": false, 00:18:46.844 "nvme_io_md": false, 00:18:46.844 "write_zeroes": true, 00:18:46.844 "zcopy": true, 00:18:46.844 "get_zone_info": false, 00:18:46.844 "zone_management": false, 00:18:46.844 "zone_append": false, 00:18:46.844 "compare": false, 00:18:46.844 "compare_and_write": false, 00:18:46.844 "abort": true, 00:18:46.844 "seek_hole": false, 00:18:46.844 "seek_data": false, 00:18:46.844 "copy": true, 00:18:46.844 "nvme_iov_md": false 00:18:46.844 }, 00:18:46.844 "memory_domains": [ 00:18:46.844 { 00:18:46.844 "dma_device_id": "system", 00:18:46.844 "dma_device_type": 1 00:18:46.844 }, 00:18:46.844 { 00:18:46.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.844 "dma_device_type": 2 00:18:46.844 } 00:18:46.844 ], 00:18:46.844 "driver_specific": {} 00:18:46.844 } 00:18:46.844 ] 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.844 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.101 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.101 "name": "Existed_Raid", 00:18:47.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.101 "strip_size_kb": 64, 00:18:47.101 "state": "configuring", 00:18:47.101 "raid_level": "concat", 00:18:47.101 "superblock": false, 00:18:47.101 "num_base_bdevs": 4, 00:18:47.101 
"num_base_bdevs_discovered": 2, 00:18:47.101 "num_base_bdevs_operational": 4, 00:18:47.101 "base_bdevs_list": [ 00:18:47.101 { 00:18:47.101 "name": "BaseBdev1", 00:18:47.101 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:47.101 "is_configured": true, 00:18:47.101 "data_offset": 0, 00:18:47.101 "data_size": 65536 00:18:47.101 }, 00:18:47.101 { 00:18:47.101 "name": "BaseBdev2", 00:18:47.101 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:47.101 "is_configured": true, 00:18:47.101 "data_offset": 0, 00:18:47.101 "data_size": 65536 00:18:47.101 }, 00:18:47.101 { 00:18:47.101 "name": "BaseBdev3", 00:18:47.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.101 "is_configured": false, 00:18:47.101 "data_offset": 0, 00:18:47.101 "data_size": 0 00:18:47.101 }, 00:18:47.101 { 00:18:47.101 "name": "BaseBdev4", 00:18:47.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.101 "is_configured": false, 00:18:47.101 "data_offset": 0, 00:18:47.101 "data_size": 0 00:18:47.101 } 00:18:47.101 ] 00:18:47.101 }' 00:18:47.101 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.101 13:37:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.666 13:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:47.923 [2024-07-15 13:37:27.161942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:47.923 BaseBdev3 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:47.923 13:37:27 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:47.923 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:48.181 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:48.438 [ 00:18:48.438 { 00:18:48.438 "name": "BaseBdev3", 00:18:48.438 "aliases": [ 00:18:48.438 "b6a67861-e6bf-4953-9f90-21250b1acddd" 00:18:48.438 ], 00:18:48.438 "product_name": "Malloc disk", 00:18:48.438 "block_size": 512, 00:18:48.438 "num_blocks": 65536, 00:18:48.438 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:48.438 "assigned_rate_limits": { 00:18:48.438 "rw_ios_per_sec": 0, 00:18:48.438 "rw_mbytes_per_sec": 0, 00:18:48.438 "r_mbytes_per_sec": 0, 00:18:48.438 "w_mbytes_per_sec": 0 00:18:48.438 }, 00:18:48.438 "claimed": true, 00:18:48.438 "claim_type": "exclusive_write", 00:18:48.438 "zoned": false, 00:18:48.438 "supported_io_types": { 00:18:48.438 "read": true, 00:18:48.438 "write": true, 00:18:48.438 "unmap": true, 00:18:48.438 "flush": true, 00:18:48.438 "reset": true, 00:18:48.438 "nvme_admin": false, 00:18:48.438 "nvme_io": false, 00:18:48.438 "nvme_io_md": false, 00:18:48.438 "write_zeroes": true, 00:18:48.438 "zcopy": true, 00:18:48.438 "get_zone_info": false, 00:18:48.438 "zone_management": false, 00:18:48.438 "zone_append": false, 00:18:48.438 "compare": false, 00:18:48.438 "compare_and_write": false, 00:18:48.438 "abort": true, 00:18:48.438 "seek_hole": false, 00:18:48.438 "seek_data": false, 00:18:48.438 "copy": 
true, 00:18:48.438 "nvme_iov_md": false 00:18:48.438 }, 00:18:48.438 "memory_domains": [ 00:18:48.438 { 00:18:48.438 "dma_device_id": "system", 00:18:48.438 "dma_device_type": 1 00:18:48.438 }, 00:18:48.438 { 00:18:48.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.438 "dma_device_type": 2 00:18:48.438 } 00:18:48.438 ], 00:18:48.438 "driver_specific": {} 00:18:48.438 } 00:18:48.438 ] 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.438 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.696 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.696 "name": "Existed_Raid", 00:18:48.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.696 "strip_size_kb": 64, 00:18:48.696 "state": "configuring", 00:18:48.696 "raid_level": "concat", 00:18:48.696 "superblock": false, 00:18:48.696 "num_base_bdevs": 4, 00:18:48.696 "num_base_bdevs_discovered": 3, 00:18:48.696 "num_base_bdevs_operational": 4, 00:18:48.696 "base_bdevs_list": [ 00:18:48.696 { 00:18:48.696 "name": "BaseBdev1", 00:18:48.696 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:48.696 "is_configured": true, 00:18:48.696 "data_offset": 0, 00:18:48.696 "data_size": 65536 00:18:48.696 }, 00:18:48.696 { 00:18:48.696 "name": "BaseBdev2", 00:18:48.696 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:48.696 "is_configured": true, 00:18:48.696 "data_offset": 0, 00:18:48.696 "data_size": 65536 00:18:48.696 }, 00:18:48.696 { 00:18:48.696 "name": "BaseBdev3", 00:18:48.696 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:48.696 "is_configured": true, 00:18:48.696 "data_offset": 0, 00:18:48.696 "data_size": 65536 00:18:48.696 }, 00:18:48.696 { 00:18:48.696 "name": "BaseBdev4", 00:18:48.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.696 "is_configured": false, 00:18:48.696 "data_offset": 0, 00:18:48.696 "data_size": 0 00:18:48.696 } 00:18:48.696 ] 00:18:48.696 }' 00:18:48.696 13:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.696 13:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.261 13:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:49.518 [2024-07-15 13:37:28.725450] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:49.518 [2024-07-15 13:37:28.725497] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f47350 00:18:49.518 [2024-07-15 13:37:28.725506] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:49.518 [2024-07-15 13:37:28.725757] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f47020 00:18:49.518 [2024-07-15 13:37:28.725884] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f47350 00:18:49.518 [2024-07-15 13:37:28.725894] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f47350 00:18:49.518 [2024-07-15 13:37:28.726065] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:49.518 BaseBdev4 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:49.518 13:37:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:49.775 13:37:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:50.033 [ 00:18:50.033 { 00:18:50.033 "name": "BaseBdev4", 00:18:50.033 "aliases": [ 00:18:50.033 "463842fd-fc0b-401a-85ab-f8883df6b637" 00:18:50.033 ], 00:18:50.033 "product_name": "Malloc disk", 00:18:50.033 "block_size": 512, 00:18:50.033 "num_blocks": 65536, 00:18:50.033 "uuid": "463842fd-fc0b-401a-85ab-f8883df6b637", 00:18:50.033 "assigned_rate_limits": { 00:18:50.033 "rw_ios_per_sec": 0, 00:18:50.033 "rw_mbytes_per_sec": 0, 00:18:50.033 "r_mbytes_per_sec": 0, 00:18:50.033 "w_mbytes_per_sec": 0 00:18:50.033 }, 00:18:50.033 "claimed": true, 00:18:50.033 "claim_type": "exclusive_write", 00:18:50.033 "zoned": false, 00:18:50.033 "supported_io_types": { 00:18:50.033 "read": true, 00:18:50.033 "write": true, 00:18:50.033 "unmap": true, 00:18:50.033 "flush": true, 00:18:50.033 "reset": true, 00:18:50.033 "nvme_admin": false, 00:18:50.033 "nvme_io": false, 00:18:50.033 "nvme_io_md": false, 00:18:50.033 "write_zeroes": true, 00:18:50.033 "zcopy": true, 00:18:50.033 "get_zone_info": false, 00:18:50.033 "zone_management": false, 00:18:50.033 "zone_append": false, 00:18:50.033 "compare": false, 00:18:50.033 "compare_and_write": false, 00:18:50.033 "abort": true, 00:18:50.033 "seek_hole": false, 00:18:50.033 "seek_data": false, 00:18:50.033 "copy": true, 00:18:50.033 "nvme_iov_md": false 00:18:50.033 }, 00:18:50.033 "memory_domains": [ 00:18:50.033 { 00:18:50.033 "dma_device_id": "system", 00:18:50.033 "dma_device_type": 1 00:18:50.033 }, 00:18:50.033 { 00:18:50.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.033 "dma_device_type": 2 00:18:50.033 } 00:18:50.033 ], 00:18:50.033 "driver_specific": {} 00:18:50.033 } 00:18:50.033 ] 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.033 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.290 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.290 "name": "Existed_Raid", 00:18:50.290 "uuid": "5c8c685d-be4c-46be-b349-b45a79621365", 00:18:50.291 "strip_size_kb": 64, 00:18:50.291 "state": "online", 00:18:50.291 "raid_level": "concat", 00:18:50.291 "superblock": false, 00:18:50.291 
"num_base_bdevs": 4, 00:18:50.291 "num_base_bdevs_discovered": 4, 00:18:50.291 "num_base_bdevs_operational": 4, 00:18:50.291 "base_bdevs_list": [ 00:18:50.291 { 00:18:50.291 "name": "BaseBdev1", 00:18:50.291 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:50.291 "is_configured": true, 00:18:50.291 "data_offset": 0, 00:18:50.291 "data_size": 65536 00:18:50.291 }, 00:18:50.291 { 00:18:50.291 "name": "BaseBdev2", 00:18:50.291 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:50.291 "is_configured": true, 00:18:50.291 "data_offset": 0, 00:18:50.291 "data_size": 65536 00:18:50.291 }, 00:18:50.291 { 00:18:50.291 "name": "BaseBdev3", 00:18:50.291 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:50.291 "is_configured": true, 00:18:50.291 "data_offset": 0, 00:18:50.291 "data_size": 65536 00:18:50.291 }, 00:18:50.291 { 00:18:50.291 "name": "BaseBdev4", 00:18:50.291 "uuid": "463842fd-fc0b-401a-85ab-f8883df6b637", 00:18:50.291 "is_configured": true, 00:18:50.291 "data_offset": 0, 00:18:50.291 "data_size": 65536 00:18:50.291 } 00:18:50.291 ] 00:18:50.291 }' 00:18:50.291 13:37:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.291 13:37:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:50.855 13:37:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:50.855 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:51.113 [2024-07-15 13:37:30.310008] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:51.113 "name": "Existed_Raid", 00:18:51.113 "aliases": [ 00:18:51.113 "5c8c685d-be4c-46be-b349-b45a79621365" 00:18:51.113 ], 00:18:51.113 "product_name": "Raid Volume", 00:18:51.113 "block_size": 512, 00:18:51.113 "num_blocks": 262144, 00:18:51.113 "uuid": "5c8c685d-be4c-46be-b349-b45a79621365", 00:18:51.113 "assigned_rate_limits": { 00:18:51.113 "rw_ios_per_sec": 0, 00:18:51.113 "rw_mbytes_per_sec": 0, 00:18:51.113 "r_mbytes_per_sec": 0, 00:18:51.113 "w_mbytes_per_sec": 0 00:18:51.113 }, 00:18:51.113 "claimed": false, 00:18:51.113 "zoned": false, 00:18:51.113 "supported_io_types": { 00:18:51.113 "read": true, 00:18:51.113 "write": true, 00:18:51.113 "unmap": true, 00:18:51.113 "flush": true, 00:18:51.113 "reset": true, 00:18:51.113 "nvme_admin": false, 00:18:51.113 "nvme_io": false, 00:18:51.113 "nvme_io_md": false, 00:18:51.113 "write_zeroes": true, 00:18:51.113 "zcopy": false, 00:18:51.113 "get_zone_info": false, 00:18:51.113 "zone_management": false, 00:18:51.113 "zone_append": false, 00:18:51.113 "compare": false, 00:18:51.113 "compare_and_write": false, 00:18:51.113 "abort": false, 00:18:51.113 "seek_hole": false, 00:18:51.113 "seek_data": false, 00:18:51.113 "copy": false, 00:18:51.113 "nvme_iov_md": false 00:18:51.113 }, 00:18:51.113 "memory_domains": [ 00:18:51.113 { 00:18:51.113 "dma_device_id": "system", 00:18:51.113 "dma_device_type": 1 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.113 
"dma_device_type": 2 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "system", 00:18:51.113 "dma_device_type": 1 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.113 "dma_device_type": 2 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "system", 00:18:51.113 "dma_device_type": 1 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.113 "dma_device_type": 2 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "system", 00:18:51.113 "dma_device_type": 1 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.113 "dma_device_type": 2 00:18:51.113 } 00:18:51.113 ], 00:18:51.113 "driver_specific": { 00:18:51.113 "raid": { 00:18:51.113 "uuid": "5c8c685d-be4c-46be-b349-b45a79621365", 00:18:51.113 "strip_size_kb": 64, 00:18:51.113 "state": "online", 00:18:51.113 "raid_level": "concat", 00:18:51.113 "superblock": false, 00:18:51.113 "num_base_bdevs": 4, 00:18:51.113 "num_base_bdevs_discovered": 4, 00:18:51.113 "num_base_bdevs_operational": 4, 00:18:51.113 "base_bdevs_list": [ 00:18:51.113 { 00:18:51.113 "name": "BaseBdev1", 00:18:51.113 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:51.113 "is_configured": true, 00:18:51.113 "data_offset": 0, 00:18:51.113 "data_size": 65536 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "name": "BaseBdev2", 00:18:51.113 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:51.113 "is_configured": true, 00:18:51.113 "data_offset": 0, 00:18:51.113 "data_size": 65536 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "name": "BaseBdev3", 00:18:51.113 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:51.113 "is_configured": true, 00:18:51.113 "data_offset": 0, 00:18:51.113 "data_size": 65536 00:18:51.113 }, 00:18:51.113 { 00:18:51.113 "name": "BaseBdev4", 00:18:51.113 "uuid": "463842fd-fc0b-401a-85ab-f8883df6b637", 00:18:51.113 "is_configured": true, 00:18:51.113 "data_offset": 0, 
00:18:51.113 "data_size": 65536 00:18:51.113 } 00:18:51.113 ] 00:18:51.113 } 00:18:51.113 } 00:18:51.113 }' 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:51.113 BaseBdev2 00:18:51.113 BaseBdev3 00:18:51.113 BaseBdev4' 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:51.113 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:51.370 "name": "BaseBdev1", 00:18:51.370 "aliases": [ 00:18:51.370 "0681b900-6db2-458f-8533-2f0c8bf389c2" 00:18:51.370 ], 00:18:51.370 "product_name": "Malloc disk", 00:18:51.370 "block_size": 512, 00:18:51.370 "num_blocks": 65536, 00:18:51.370 "uuid": "0681b900-6db2-458f-8533-2f0c8bf389c2", 00:18:51.370 "assigned_rate_limits": { 00:18:51.370 "rw_ios_per_sec": 0, 00:18:51.370 "rw_mbytes_per_sec": 0, 00:18:51.370 "r_mbytes_per_sec": 0, 00:18:51.370 "w_mbytes_per_sec": 0 00:18:51.370 }, 00:18:51.370 "claimed": true, 00:18:51.370 "claim_type": "exclusive_write", 00:18:51.370 "zoned": false, 00:18:51.370 "supported_io_types": { 00:18:51.370 "read": true, 00:18:51.370 "write": true, 00:18:51.370 "unmap": true, 00:18:51.370 "flush": true, 00:18:51.370 "reset": true, 00:18:51.370 "nvme_admin": false, 00:18:51.370 "nvme_io": false, 00:18:51.370 "nvme_io_md": false, 00:18:51.370 "write_zeroes": true, 00:18:51.370 "zcopy": true, 00:18:51.370 "get_zone_info": false, 00:18:51.370 "zone_management": 
false, 00:18:51.370 "zone_append": false, 00:18:51.370 "compare": false, 00:18:51.370 "compare_and_write": false, 00:18:51.370 "abort": true, 00:18:51.370 "seek_hole": false, 00:18:51.370 "seek_data": false, 00:18:51.370 "copy": true, 00:18:51.370 "nvme_iov_md": false 00:18:51.370 }, 00:18:51.370 "memory_domains": [ 00:18:51.370 { 00:18:51.370 "dma_device_id": "system", 00:18:51.370 "dma_device_type": 1 00:18:51.370 }, 00:18:51.370 { 00:18:51.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.370 "dma_device_type": 2 00:18:51.370 } 00:18:51.370 ], 00:18:51.370 "driver_specific": {} 00:18:51.370 }' 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.370 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.627 13:37:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:51.627 13:37:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:51.883 "name": "BaseBdev2", 00:18:51.883 "aliases": [ 00:18:51.883 "439b3ea0-a70a-4b34-b15e-208df14394f7" 00:18:51.883 ], 00:18:51.883 "product_name": "Malloc disk", 00:18:51.883 "block_size": 512, 00:18:51.883 "num_blocks": 65536, 00:18:51.883 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:51.883 "assigned_rate_limits": { 00:18:51.883 "rw_ios_per_sec": 0, 00:18:51.883 "rw_mbytes_per_sec": 0, 00:18:51.883 "r_mbytes_per_sec": 0, 00:18:51.883 "w_mbytes_per_sec": 0 00:18:51.883 }, 00:18:51.883 "claimed": true, 00:18:51.883 "claim_type": "exclusive_write", 00:18:51.883 "zoned": false, 00:18:51.883 "supported_io_types": { 00:18:51.883 "read": true, 00:18:51.883 "write": true, 00:18:51.883 "unmap": true, 00:18:51.883 "flush": true, 00:18:51.883 "reset": true, 00:18:51.883 "nvme_admin": false, 00:18:51.883 "nvme_io": false, 00:18:51.883 "nvme_io_md": false, 00:18:51.883 "write_zeroes": true, 00:18:51.883 "zcopy": true, 00:18:51.883 "get_zone_info": false, 00:18:51.883 "zone_management": false, 00:18:51.883 "zone_append": false, 00:18:51.883 "compare": false, 00:18:51.883 "compare_and_write": false, 00:18:51.883 "abort": true, 00:18:51.883 "seek_hole": false, 00:18:51.883 "seek_data": false, 00:18:51.883 "copy": true, 00:18:51.883 "nvme_iov_md": false 00:18:51.883 }, 00:18:51.883 "memory_domains": [ 00:18:51.883 { 00:18:51.883 "dma_device_id": "system", 00:18:51.883 "dma_device_type": 1 00:18:51.883 }, 00:18:51.883 { 00:18:51.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.883 "dma_device_type": 2 00:18:51.883 } 00:18:51.883 ], 00:18:51.883 "driver_specific": {} 00:18:51.883 
}' 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.883 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:52.140 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.703 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.703 "name": "BaseBdev3", 00:18:52.703 "aliases": [ 00:18:52.703 "b6a67861-e6bf-4953-9f90-21250b1acddd" 00:18:52.703 ], 00:18:52.703 "product_name": "Malloc disk", 00:18:52.703 "block_size": 512, 00:18:52.703 "num_blocks": 65536, 
00:18:52.703 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:52.703 "assigned_rate_limits": { 00:18:52.703 "rw_ios_per_sec": 0, 00:18:52.703 "rw_mbytes_per_sec": 0, 00:18:52.703 "r_mbytes_per_sec": 0, 00:18:52.703 "w_mbytes_per_sec": 0 00:18:52.703 }, 00:18:52.703 "claimed": true, 00:18:52.703 "claim_type": "exclusive_write", 00:18:52.703 "zoned": false, 00:18:52.703 "supported_io_types": { 00:18:52.703 "read": true, 00:18:52.703 "write": true, 00:18:52.703 "unmap": true, 00:18:52.703 "flush": true, 00:18:52.703 "reset": true, 00:18:52.703 "nvme_admin": false, 00:18:52.703 "nvme_io": false, 00:18:52.703 "nvme_io_md": false, 00:18:52.703 "write_zeroes": true, 00:18:52.703 "zcopy": true, 00:18:52.703 "get_zone_info": false, 00:18:52.703 "zone_management": false, 00:18:52.703 "zone_append": false, 00:18:52.703 "compare": false, 00:18:52.703 "compare_and_write": false, 00:18:52.703 "abort": true, 00:18:52.703 "seek_hole": false, 00:18:52.703 "seek_data": false, 00:18:52.703 "copy": true, 00:18:52.703 "nvme_iov_md": false 00:18:52.703 }, 00:18:52.703 "memory_domains": [ 00:18:52.703 { 00:18:52.704 "dma_device_id": "system", 00:18:52.704 "dma_device_type": 1 00:18:52.704 }, 00:18:52.704 { 00:18:52.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.704 "dma_device_type": 2 00:18:52.704 } 00:18:52.704 ], 00:18:52.704 "driver_specific": {} 00:18:52.704 }' 00:18:52.704 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.704 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.704 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.704 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:52.961 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.218 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.218 "name": "BaseBdev4", 00:18:53.218 "aliases": [ 00:18:53.218 "463842fd-fc0b-401a-85ab-f8883df6b637" 00:18:53.218 ], 00:18:53.218 "product_name": "Malloc disk", 00:18:53.218 "block_size": 512, 00:18:53.218 "num_blocks": 65536, 00:18:53.218 "uuid": "463842fd-fc0b-401a-85ab-f8883df6b637", 00:18:53.218 "assigned_rate_limits": { 00:18:53.218 "rw_ios_per_sec": 0, 00:18:53.218 "rw_mbytes_per_sec": 0, 00:18:53.218 "r_mbytes_per_sec": 0, 00:18:53.218 "w_mbytes_per_sec": 0 00:18:53.218 }, 00:18:53.218 "claimed": true, 00:18:53.218 "claim_type": "exclusive_write", 00:18:53.218 "zoned": false, 00:18:53.218 "supported_io_types": { 00:18:53.218 "read": true, 00:18:53.218 "write": true, 00:18:53.218 "unmap": true, 00:18:53.218 "flush": true, 00:18:53.218 "reset": true, 00:18:53.218 "nvme_admin": false, 00:18:53.218 "nvme_io": false, 00:18:53.218 
"nvme_io_md": false, 00:18:53.218 "write_zeroes": true, 00:18:53.218 "zcopy": true, 00:18:53.218 "get_zone_info": false, 00:18:53.218 "zone_management": false, 00:18:53.218 "zone_append": false, 00:18:53.218 "compare": false, 00:18:53.218 "compare_and_write": false, 00:18:53.218 "abort": true, 00:18:53.218 "seek_hole": false, 00:18:53.218 "seek_data": false, 00:18:53.218 "copy": true, 00:18:53.218 "nvme_iov_md": false 00:18:53.218 }, 00:18:53.218 "memory_domains": [ 00:18:53.218 { 00:18:53.218 "dma_device_id": "system", 00:18:53.218 "dma_device_type": 1 00:18:53.218 }, 00:18:53.218 { 00:18:53.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.218 "dma_device_type": 2 00:18:53.218 } 00:18:53.218 ], 00:18:53.218 "driver_specific": {} 00:18:53.218 }' 00:18:53.218 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.218 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.474 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.730 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.730 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:18:53.730 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:54.302 [2024-07-15 13:37:33.470110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:54.302 [2024-07-15 13:37:33.470141] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:54.302 [2024-07-15 13:37:33.470191] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.302 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.602 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.602 "name": "Existed_Raid", 00:18:54.602 "uuid": "5c8c685d-be4c-46be-b349-b45a79621365", 00:18:54.602 "strip_size_kb": 64, 00:18:54.602 "state": "offline", 00:18:54.602 "raid_level": "concat", 00:18:54.602 "superblock": false, 00:18:54.602 "num_base_bdevs": 4, 00:18:54.602 "num_base_bdevs_discovered": 3, 00:18:54.602 "num_base_bdevs_operational": 3, 00:18:54.602 "base_bdevs_list": [ 00:18:54.602 { 00:18:54.602 "name": null, 00:18:54.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.602 "is_configured": false, 00:18:54.602 "data_offset": 0, 00:18:54.602 "data_size": 65536 00:18:54.602 }, 00:18:54.602 { 00:18:54.602 "name": "BaseBdev2", 00:18:54.602 "uuid": "439b3ea0-a70a-4b34-b15e-208df14394f7", 00:18:54.602 "is_configured": true, 00:18:54.602 "data_offset": 0, 00:18:54.602 "data_size": 65536 00:18:54.602 }, 00:18:54.602 { 00:18:54.602 "name": "BaseBdev3", 00:18:54.603 "uuid": "b6a67861-e6bf-4953-9f90-21250b1acddd", 00:18:54.603 "is_configured": true, 00:18:54.603 "data_offset": 0, 00:18:54.603 "data_size": 65536 00:18:54.603 }, 00:18:54.603 { 00:18:54.603 "name": "BaseBdev4", 00:18:54.603 "uuid": "463842fd-fc0b-401a-85ab-f8883df6b637", 00:18:54.603 "is_configured": true, 00:18:54.603 "data_offset": 0, 00:18:54.603 "data_size": 65536 00:18:54.603 } 00:18:54.603 ] 00:18:54.603 }' 
00:18:54.603 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.603 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:55.536 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:55.794 [2024-07-15 13:37:35.068405] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:55.794 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:55.794 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:55.794 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.794 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:56.077 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:56.077 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:18:56.077 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:56.334 [2024-07-15 13:37:35.565958] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:56.334 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:56.334 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:56.334 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.334 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:56.592 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:56.592 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:56.592 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:56.592 [2024-07-15 13:37:35.983508] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:56.592 [2024-07-15 13:37:35.983554] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f47350 name Existed_Raid, state offline 00:18:56.592 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:56.592 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:56.592 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.592 13:37:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:56.849 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:57.105 BaseBdev2 00:18:57.105 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:57.105 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:57.105 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:57.106 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:57.106 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:57.106 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:57.106 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.363 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:57.935 [ 00:18:57.935 { 00:18:57.935 "name": 
"BaseBdev2", 00:18:57.935 "aliases": [ 00:18:57.935 "071b8d6c-39e5-47f7-9c0d-1ffb525f5061" 00:18:57.935 ], 00:18:57.935 "product_name": "Malloc disk", 00:18:57.935 "block_size": 512, 00:18:57.935 "num_blocks": 65536, 00:18:57.935 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:18:57.935 "assigned_rate_limits": { 00:18:57.935 "rw_ios_per_sec": 0, 00:18:57.935 "rw_mbytes_per_sec": 0, 00:18:57.935 "r_mbytes_per_sec": 0, 00:18:57.935 "w_mbytes_per_sec": 0 00:18:57.935 }, 00:18:57.935 "claimed": false, 00:18:57.935 "zoned": false, 00:18:57.935 "supported_io_types": { 00:18:57.935 "read": true, 00:18:57.935 "write": true, 00:18:57.935 "unmap": true, 00:18:57.935 "flush": true, 00:18:57.935 "reset": true, 00:18:57.935 "nvme_admin": false, 00:18:57.935 "nvme_io": false, 00:18:57.935 "nvme_io_md": false, 00:18:57.935 "write_zeroes": true, 00:18:57.935 "zcopy": true, 00:18:57.935 "get_zone_info": false, 00:18:57.935 "zone_management": false, 00:18:57.935 "zone_append": false, 00:18:57.935 "compare": false, 00:18:57.935 "compare_and_write": false, 00:18:57.935 "abort": true, 00:18:57.935 "seek_hole": false, 00:18:57.935 "seek_data": false, 00:18:57.935 "copy": true, 00:18:57.935 "nvme_iov_md": false 00:18:57.935 }, 00:18:57.935 "memory_domains": [ 00:18:57.935 { 00:18:57.935 "dma_device_id": "system", 00:18:57.935 "dma_device_type": 1 00:18:57.935 }, 00:18:57.935 { 00:18:57.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.935 "dma_device_type": 2 00:18:57.935 } 00:18:57.935 ], 00:18:57.935 "driver_specific": {} 00:18:57.935 } 00:18:57.935 ] 00:18:57.935 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:57.935 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:57.935 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:57.935 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:58.192 BaseBdev3 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:58.192 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:58.449 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:59.011 [ 00:18:59.011 { 00:18:59.011 "name": "BaseBdev3", 00:18:59.011 "aliases": [ 00:18:59.011 "e87ee355-0adb-496c-b475-e181ec45ba74" 00:18:59.011 ], 00:18:59.011 "product_name": "Malloc disk", 00:18:59.011 "block_size": 512, 00:18:59.011 "num_blocks": 65536, 00:18:59.011 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:18:59.011 "assigned_rate_limits": { 00:18:59.011 "rw_ios_per_sec": 0, 00:18:59.011 "rw_mbytes_per_sec": 0, 00:18:59.011 "r_mbytes_per_sec": 0, 00:18:59.011 "w_mbytes_per_sec": 0 00:18:59.011 }, 00:18:59.011 "claimed": false, 00:18:59.011 "zoned": false, 00:18:59.011 "supported_io_types": { 00:18:59.011 "read": true, 00:18:59.011 "write": true, 00:18:59.011 "unmap": true, 00:18:59.011 "flush": true, 00:18:59.011 
"reset": true, 00:18:59.011 "nvme_admin": false, 00:18:59.011 "nvme_io": false, 00:18:59.011 "nvme_io_md": false, 00:18:59.011 "write_zeroes": true, 00:18:59.011 "zcopy": true, 00:18:59.011 "get_zone_info": false, 00:18:59.011 "zone_management": false, 00:18:59.011 "zone_append": false, 00:18:59.011 "compare": false, 00:18:59.011 "compare_and_write": false, 00:18:59.011 "abort": true, 00:18:59.011 "seek_hole": false, 00:18:59.011 "seek_data": false, 00:18:59.011 "copy": true, 00:18:59.011 "nvme_iov_md": false 00:18:59.011 }, 00:18:59.011 "memory_domains": [ 00:18:59.011 { 00:18:59.011 "dma_device_id": "system", 00:18:59.011 "dma_device_type": 1 00:18:59.011 }, 00:18:59.011 { 00:18:59.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.011 "dma_device_type": 2 00:18:59.011 } 00:18:59.011 ], 00:18:59.011 "driver_specific": {} 00:18:59.011 } 00:18:59.011 ] 00:18:59.011 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:59.011 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:59.011 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:59.011 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:59.268 BaseBdev4 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:59.268 13:37:38 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:59.268 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:59.525 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:59.782 [ 00:18:59.782 { 00:18:59.782 "name": "BaseBdev4", 00:18:59.782 "aliases": [ 00:18:59.782 "b7773315-33b7-4eef-a49e-397d026a399d" 00:18:59.782 ], 00:18:59.782 "product_name": "Malloc disk", 00:18:59.782 "block_size": 512, 00:18:59.782 "num_blocks": 65536, 00:18:59.782 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:18:59.782 "assigned_rate_limits": { 00:18:59.782 "rw_ios_per_sec": 0, 00:18:59.782 "rw_mbytes_per_sec": 0, 00:18:59.782 "r_mbytes_per_sec": 0, 00:18:59.782 "w_mbytes_per_sec": 0 00:18:59.782 }, 00:18:59.782 "claimed": false, 00:18:59.782 "zoned": false, 00:18:59.782 "supported_io_types": { 00:18:59.782 "read": true, 00:18:59.782 "write": true, 00:18:59.782 "unmap": true, 00:18:59.782 "flush": true, 00:18:59.782 "reset": true, 00:18:59.782 "nvme_admin": false, 00:18:59.782 "nvme_io": false, 00:18:59.782 "nvme_io_md": false, 00:18:59.782 "write_zeroes": true, 00:18:59.782 "zcopy": true, 00:18:59.782 "get_zone_info": false, 00:18:59.782 "zone_management": false, 00:18:59.782 "zone_append": false, 00:18:59.782 "compare": false, 00:18:59.782 "compare_and_write": false, 00:18:59.782 "abort": true, 00:18:59.782 "seek_hole": false, 00:18:59.782 "seek_data": false, 00:18:59.782 "copy": true, 00:18:59.782 "nvme_iov_md": false 00:18:59.782 }, 00:18:59.782 "memory_domains": [ 00:18:59.782 { 00:18:59.782 "dma_device_id": "system", 00:18:59.782 "dma_device_type": 1 00:18:59.782 }, 00:18:59.782 { 00:18:59.782 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:59.782 "dma_device_type": 2 00:18:59.782 } 00:18:59.782 ], 00:18:59.782 "driver_specific": {} 00:18:59.782 } 00:18:59.782 ] 00:18:59.782 13:37:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:59.782 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:59.782 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:59.782 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:59.782 [2024-07-15 13:37:39.192218] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:59.782 [2024-07-15 13:37:39.192266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:59.782 [2024-07-15 13:37:39.192286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:59.782 [2024-07-15 13:37:39.193661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:59.782 [2024-07-15 13:37:39.193703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.039 
13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.039 "name": "Existed_Raid", 00:19:00.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.039 "strip_size_kb": 64, 00:19:00.039 "state": "configuring", 00:19:00.039 "raid_level": "concat", 00:19:00.039 "superblock": false, 00:19:00.039 "num_base_bdevs": 4, 00:19:00.039 "num_base_bdevs_discovered": 3, 00:19:00.039 "num_base_bdevs_operational": 4, 00:19:00.039 "base_bdevs_list": [ 00:19:00.039 { 00:19:00.039 "name": "BaseBdev1", 00:19:00.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.039 "is_configured": false, 00:19:00.039 "data_offset": 0, 00:19:00.039 "data_size": 0 00:19:00.039 }, 00:19:00.039 { 00:19:00.039 "name": "BaseBdev2", 00:19:00.039 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:00.039 "is_configured": true, 00:19:00.039 "data_offset": 0, 00:19:00.039 "data_size": 65536 00:19:00.039 }, 00:19:00.039 { 00:19:00.039 "name": "BaseBdev3", 00:19:00.039 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:00.039 "is_configured": true, 00:19:00.039 "data_offset": 
0, 00:19:00.039 "data_size": 65536 00:19:00.039 }, 00:19:00.039 { 00:19:00.039 "name": "BaseBdev4", 00:19:00.039 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:00.039 "is_configured": true, 00:19:00.039 "data_offset": 0, 00:19:00.039 "data_size": 65536 00:19:00.039 } 00:19:00.039 ] 00:19:00.039 }' 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.039 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:00.972 [2024-07-15 13:37:40.279075] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.972 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.230 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.230 "name": "Existed_Raid", 00:19:01.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.230 "strip_size_kb": 64, 00:19:01.230 "state": "configuring", 00:19:01.230 "raid_level": "concat", 00:19:01.230 "superblock": false, 00:19:01.230 "num_base_bdevs": 4, 00:19:01.230 "num_base_bdevs_discovered": 2, 00:19:01.230 "num_base_bdevs_operational": 4, 00:19:01.230 "base_bdevs_list": [ 00:19:01.230 { 00:19:01.230 "name": "BaseBdev1", 00:19:01.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.230 "is_configured": false, 00:19:01.230 "data_offset": 0, 00:19:01.230 "data_size": 0 00:19:01.230 }, 00:19:01.230 { 00:19:01.230 "name": null, 00:19:01.230 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:01.230 "is_configured": false, 00:19:01.230 "data_offset": 0, 00:19:01.230 "data_size": 65536 00:19:01.230 }, 00:19:01.230 { 00:19:01.230 "name": "BaseBdev3", 00:19:01.230 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:01.230 "is_configured": true, 00:19:01.230 "data_offset": 0, 00:19:01.230 "data_size": 65536 00:19:01.230 }, 00:19:01.230 { 00:19:01.230 "name": "BaseBdev4", 00:19:01.230 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:01.230 "is_configured": true, 00:19:01.230 "data_offset": 0, 00:19:01.230 "data_size": 65536 00:19:01.230 } 00:19:01.230 ] 00:19:01.230 }' 00:19:01.230 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.230 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.795 13:37:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.795 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:02.053 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:02.053 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:02.311 [2024-07-15 13:37:41.542979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:02.311 BaseBdev1 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:02.311 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:02.568 13:37:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:02.826 [ 00:19:02.826 { 00:19:02.826 "name": "BaseBdev1", 00:19:02.826 "aliases": [ 00:19:02.826 
"9cbdcb6d-1b57-4d03-a56e-dd48702901db" 00:19:02.826 ], 00:19:02.826 "product_name": "Malloc disk", 00:19:02.826 "block_size": 512, 00:19:02.826 "num_blocks": 65536, 00:19:02.826 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:02.826 "assigned_rate_limits": { 00:19:02.826 "rw_ios_per_sec": 0, 00:19:02.826 "rw_mbytes_per_sec": 0, 00:19:02.826 "r_mbytes_per_sec": 0, 00:19:02.826 "w_mbytes_per_sec": 0 00:19:02.826 }, 00:19:02.826 "claimed": true, 00:19:02.826 "claim_type": "exclusive_write", 00:19:02.826 "zoned": false, 00:19:02.826 "supported_io_types": { 00:19:02.826 "read": true, 00:19:02.826 "write": true, 00:19:02.826 "unmap": true, 00:19:02.826 "flush": true, 00:19:02.826 "reset": true, 00:19:02.826 "nvme_admin": false, 00:19:02.826 "nvme_io": false, 00:19:02.826 "nvme_io_md": false, 00:19:02.826 "write_zeroes": true, 00:19:02.826 "zcopy": true, 00:19:02.826 "get_zone_info": false, 00:19:02.826 "zone_management": false, 00:19:02.826 "zone_append": false, 00:19:02.826 "compare": false, 00:19:02.826 "compare_and_write": false, 00:19:02.826 "abort": true, 00:19:02.826 "seek_hole": false, 00:19:02.826 "seek_data": false, 00:19:02.826 "copy": true, 00:19:02.826 "nvme_iov_md": false 00:19:02.826 }, 00:19:02.826 "memory_domains": [ 00:19:02.826 { 00:19:02.826 "dma_device_id": "system", 00:19:02.826 "dma_device_type": 1 00:19:02.826 }, 00:19:02.826 { 00:19:02.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.826 "dma_device_type": 2 00:19:02.826 } 00:19:02.826 ], 00:19:02.826 "driver_specific": {} 00:19:02.826 } 00:19:02.826 ] 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:02.826 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.084 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.084 "name": "Existed_Raid", 00:19:03.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.084 "strip_size_kb": 64, 00:19:03.084 "state": "configuring", 00:19:03.084 "raid_level": "concat", 00:19:03.085 "superblock": false, 00:19:03.085 "num_base_bdevs": 4, 00:19:03.085 "num_base_bdevs_discovered": 3, 00:19:03.085 "num_base_bdevs_operational": 4, 00:19:03.085 "base_bdevs_list": [ 00:19:03.085 { 00:19:03.085 "name": "BaseBdev1", 00:19:03.085 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:03.085 "is_configured": true, 00:19:03.085 "data_offset": 0, 00:19:03.085 "data_size": 65536 00:19:03.085 }, 00:19:03.085 { 00:19:03.085 "name": null, 00:19:03.085 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 
00:19:03.085 "is_configured": false, 00:19:03.085 "data_offset": 0, 00:19:03.085 "data_size": 65536 00:19:03.085 }, 00:19:03.085 { 00:19:03.085 "name": "BaseBdev3", 00:19:03.085 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:03.085 "is_configured": true, 00:19:03.085 "data_offset": 0, 00:19:03.085 "data_size": 65536 00:19:03.085 }, 00:19:03.085 { 00:19:03.085 "name": "BaseBdev4", 00:19:03.085 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:03.085 "is_configured": true, 00:19:03.085 "data_offset": 0, 00:19:03.085 "data_size": 65536 00:19:03.085 } 00:19:03.085 ] 00:19:03.085 }' 00:19:03.085 13:37:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.085 13:37:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.020 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.020 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:04.020 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:04.020 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:04.277 [2024-07-15 13:37:43.624572] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.277 13:37:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.277 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.535 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.535 "name": "Existed_Raid", 00:19:04.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.535 "strip_size_kb": 64, 00:19:04.535 "state": "configuring", 00:19:04.535 "raid_level": "concat", 00:19:04.535 "superblock": false, 00:19:04.535 "num_base_bdevs": 4, 00:19:04.535 "num_base_bdevs_discovered": 2, 00:19:04.535 "num_base_bdevs_operational": 4, 00:19:04.535 "base_bdevs_list": [ 00:19:04.535 { 00:19:04.535 "name": "BaseBdev1", 00:19:04.535 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:04.535 "is_configured": true, 00:19:04.535 "data_offset": 0, 00:19:04.535 "data_size": 65536 00:19:04.535 }, 00:19:04.535 { 00:19:04.535 "name": null, 00:19:04.535 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:04.535 "is_configured": false, 00:19:04.535 "data_offset": 0, 00:19:04.535 
"data_size": 65536 00:19:04.535 }, 00:19:04.535 { 00:19:04.535 "name": null, 00:19:04.535 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:04.535 "is_configured": false, 00:19:04.535 "data_offset": 0, 00:19:04.535 "data_size": 65536 00:19:04.535 }, 00:19:04.535 { 00:19:04.535 "name": "BaseBdev4", 00:19:04.535 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:04.535 "is_configured": true, 00:19:04.535 "data_offset": 0, 00:19:04.535 "data_size": 65536 00:19:04.535 } 00:19:04.535 ] 00:19:04.535 }' 00:19:04.535 13:37:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.535 13:37:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.101 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.101 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:05.359 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:05.359 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:05.617 [2024-07-15 13:37:44.899982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.617 13:37:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.875 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.875 "name": "Existed_Raid", 00:19:05.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.875 "strip_size_kb": 64, 00:19:05.875 "state": "configuring", 00:19:05.875 "raid_level": "concat", 00:19:05.875 "superblock": false, 00:19:05.875 "num_base_bdevs": 4, 00:19:05.875 "num_base_bdevs_discovered": 3, 00:19:05.875 "num_base_bdevs_operational": 4, 00:19:05.875 "base_bdevs_list": [ 00:19:05.875 { 00:19:05.875 "name": "BaseBdev1", 00:19:05.875 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:05.875 "is_configured": true, 00:19:05.875 "data_offset": 0, 00:19:05.875 "data_size": 65536 00:19:05.875 }, 00:19:05.875 { 00:19:05.875 "name": null, 00:19:05.875 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:05.875 "is_configured": false, 00:19:05.875 "data_offset": 0, 00:19:05.875 "data_size": 65536 00:19:05.875 }, 00:19:05.875 { 00:19:05.875 "name": 
"BaseBdev3", 00:19:05.875 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:05.875 "is_configured": true, 00:19:05.875 "data_offset": 0, 00:19:05.875 "data_size": 65536 00:19:05.875 }, 00:19:05.875 { 00:19:05.875 "name": "BaseBdev4", 00:19:05.875 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:05.875 "is_configured": true, 00:19:05.875 "data_offset": 0, 00:19:05.875 "data_size": 65536 00:19:05.875 } 00:19:05.875 ] 00:19:05.875 }' 00:19:05.875 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.875 13:37:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.441 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:06.441 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.698 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:06.698 13:37:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:06.956 [2024-07-15 13:37:46.151321] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.956 "name": "Existed_Raid", 00:19:06.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.956 "strip_size_kb": 64, 00:19:06.956 "state": "configuring", 00:19:06.956 "raid_level": "concat", 00:19:06.956 "superblock": false, 00:19:06.956 "num_base_bdevs": 4, 00:19:06.956 "num_base_bdevs_discovered": 2, 00:19:06.956 "num_base_bdevs_operational": 4, 00:19:06.956 "base_bdevs_list": [ 00:19:06.956 { 00:19:06.956 "name": null, 00:19:06.956 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:06.956 "is_configured": false, 00:19:06.956 "data_offset": 0, 00:19:06.956 "data_size": 65536 00:19:06.956 }, 00:19:06.956 { 00:19:06.956 "name": null, 00:19:06.956 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:06.956 "is_configured": false, 00:19:06.956 "data_offset": 0, 00:19:06.956 "data_size": 65536 00:19:06.956 }, 00:19:06.956 { 00:19:06.956 "name": "BaseBdev3", 00:19:06.956 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:06.956 "is_configured": true, 
00:19:06.956 "data_offset": 0, 00:19:06.956 "data_size": 65536 00:19:06.956 }, 00:19:06.956 { 00:19:06.956 "name": "BaseBdev4", 00:19:06.956 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:06.956 "is_configured": true, 00:19:06.956 "data_offset": 0, 00:19:06.956 "data_size": 65536 00:19:06.956 } 00:19:06.956 ] 00:19:06.956 }' 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.956 13:37:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.889 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.889 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:08.146 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:08.146 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:08.404 [2024-07-15 13:37:47.705916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.404 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.661 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.661 "name": "Existed_Raid", 00:19:08.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.661 "strip_size_kb": 64, 00:19:08.661 "state": "configuring", 00:19:08.661 "raid_level": "concat", 00:19:08.661 "superblock": false, 00:19:08.661 "num_base_bdevs": 4, 00:19:08.661 "num_base_bdevs_discovered": 3, 00:19:08.661 "num_base_bdevs_operational": 4, 00:19:08.661 "base_bdevs_list": [ 00:19:08.662 { 00:19:08.662 "name": null, 00:19:08.662 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:08.662 "is_configured": false, 00:19:08.662 "data_offset": 0, 00:19:08.662 "data_size": 65536 00:19:08.662 }, 00:19:08.662 { 00:19:08.662 "name": "BaseBdev2", 00:19:08.662 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:08.662 "is_configured": true, 00:19:08.662 "data_offset": 0, 00:19:08.662 "data_size": 65536 00:19:08.662 }, 00:19:08.662 { 00:19:08.662 "name": "BaseBdev3", 00:19:08.662 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:08.662 "is_configured": true, 00:19:08.662 "data_offset": 0, 00:19:08.662 "data_size": 65536 00:19:08.662 
}, 00:19:08.662 { 00:19:08.662 "name": "BaseBdev4", 00:19:08.662 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:08.662 "is_configured": true, 00:19:08.662 "data_offset": 0, 00:19:08.662 "data_size": 65536 00:19:08.662 } 00:19:08.662 ] 00:19:08.662 }' 00:19:08.662 13:37:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.662 13:37:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.233 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.233 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:09.516 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:09.516 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.516 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:09.789 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9cbdcb6d-1b57-4d03-a56e-dd48702901db 00:19:10.047 [2024-07-15 13:37:49.218496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:10.047 [2024-07-15 13:37:49.218537] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f4b040 00:19:10.047 [2024-07-15 13:37:49.218546] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:10.047 [2024-07-15 13:37:49.218753] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f46a70 00:19:10.047 
[2024-07-15 13:37:49.218872] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f4b040 00:19:10.047 [2024-07-15 13:37:49.218881] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f4b040 00:19:10.047 [2024-07-15 13:37:49.219054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.047 NewBaseBdev 00:19:10.047 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:10.047 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:10.047 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:10.048 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:10.048 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:10.048 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:10.048 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:10.305 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:10.305 [ 00:19:10.305 { 00:19:10.305 "name": "NewBaseBdev", 00:19:10.305 "aliases": [ 00:19:10.305 "9cbdcb6d-1b57-4d03-a56e-dd48702901db" 00:19:10.305 ], 00:19:10.305 "product_name": "Malloc disk", 00:19:10.305 "block_size": 512, 00:19:10.305 "num_blocks": 65536, 00:19:10.305 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:10.305 "assigned_rate_limits": { 00:19:10.305 "rw_ios_per_sec": 0, 00:19:10.305 "rw_mbytes_per_sec": 0, 00:19:10.305 "r_mbytes_per_sec": 0, 00:19:10.305 
"w_mbytes_per_sec": 0 00:19:10.305 }, 00:19:10.305 "claimed": true, 00:19:10.305 "claim_type": "exclusive_write", 00:19:10.305 "zoned": false, 00:19:10.305 "supported_io_types": { 00:19:10.305 "read": true, 00:19:10.305 "write": true, 00:19:10.305 "unmap": true, 00:19:10.305 "flush": true, 00:19:10.305 "reset": true, 00:19:10.305 "nvme_admin": false, 00:19:10.305 "nvme_io": false, 00:19:10.306 "nvme_io_md": false, 00:19:10.306 "write_zeroes": true, 00:19:10.306 "zcopy": true, 00:19:10.306 "get_zone_info": false, 00:19:10.306 "zone_management": false, 00:19:10.306 "zone_append": false, 00:19:10.306 "compare": false, 00:19:10.306 "compare_and_write": false, 00:19:10.306 "abort": true, 00:19:10.306 "seek_hole": false, 00:19:10.306 "seek_data": false, 00:19:10.306 "copy": true, 00:19:10.306 "nvme_iov_md": false 00:19:10.306 }, 00:19:10.306 "memory_domains": [ 00:19:10.306 { 00:19:10.306 "dma_device_id": "system", 00:19:10.306 "dma_device_type": 1 00:19:10.306 }, 00:19:10.306 { 00:19:10.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.306 "dma_device_type": 2 00:19:10.306 } 00:19:10.306 ], 00:19:10.306 "driver_specific": {} 00:19:10.306 } 00:19:10.306 ] 00:19:10.563 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:10.563 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:10.563 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.564 "name": "Existed_Raid", 00:19:10.564 "uuid": "355bcfff-3da0-483b-89a7-7223b88783f3", 00:19:10.564 "strip_size_kb": 64, 00:19:10.564 "state": "online", 00:19:10.564 "raid_level": "concat", 00:19:10.564 "superblock": false, 00:19:10.564 "num_base_bdevs": 4, 00:19:10.564 "num_base_bdevs_discovered": 4, 00:19:10.564 "num_base_bdevs_operational": 4, 00:19:10.564 "base_bdevs_list": [ 00:19:10.564 { 00:19:10.564 "name": "NewBaseBdev", 00:19:10.564 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:10.564 "is_configured": true, 00:19:10.564 "data_offset": 0, 00:19:10.564 "data_size": 65536 00:19:10.564 }, 00:19:10.564 { 00:19:10.564 "name": "BaseBdev2", 00:19:10.564 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:10.564 "is_configured": true, 00:19:10.564 "data_offset": 0, 00:19:10.564 "data_size": 65536 00:19:10.564 }, 00:19:10.564 { 00:19:10.564 "name": "BaseBdev3", 00:19:10.564 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:10.564 "is_configured": true, 00:19:10.564 "data_offset": 0, 00:19:10.564 "data_size": 65536 00:19:10.564 }, 00:19:10.564 { 00:19:10.564 "name": "BaseBdev4", 
00:19:10.564 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:10.564 "is_configured": true, 00:19:10.564 "data_offset": 0, 00:19:10.564 "data_size": 65536 00:19:10.564 } 00:19:10.564 ] 00:19:10.564 }' 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.564 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:11.499 [2024-07-15 13:37:50.795021] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:11.499 "name": "Existed_Raid", 00:19:11.499 "aliases": [ 00:19:11.499 "355bcfff-3da0-483b-89a7-7223b88783f3" 00:19:11.499 ], 00:19:11.499 "product_name": "Raid Volume", 00:19:11.499 "block_size": 512, 00:19:11.499 "num_blocks": 262144, 00:19:11.499 "uuid": "355bcfff-3da0-483b-89a7-7223b88783f3", 00:19:11.499 "assigned_rate_limits": { 00:19:11.499 "rw_ios_per_sec": 0, 00:19:11.499 
"rw_mbytes_per_sec": 0, 00:19:11.499 "r_mbytes_per_sec": 0, 00:19:11.499 "w_mbytes_per_sec": 0 00:19:11.499 }, 00:19:11.499 "claimed": false, 00:19:11.499 "zoned": false, 00:19:11.499 "supported_io_types": { 00:19:11.499 "read": true, 00:19:11.499 "write": true, 00:19:11.499 "unmap": true, 00:19:11.499 "flush": true, 00:19:11.499 "reset": true, 00:19:11.499 "nvme_admin": false, 00:19:11.499 "nvme_io": false, 00:19:11.499 "nvme_io_md": false, 00:19:11.499 "write_zeroes": true, 00:19:11.499 "zcopy": false, 00:19:11.499 "get_zone_info": false, 00:19:11.499 "zone_management": false, 00:19:11.499 "zone_append": false, 00:19:11.499 "compare": false, 00:19:11.499 "compare_and_write": false, 00:19:11.499 "abort": false, 00:19:11.499 "seek_hole": false, 00:19:11.499 "seek_data": false, 00:19:11.499 "copy": false, 00:19:11.499 "nvme_iov_md": false 00:19:11.499 }, 00:19:11.499 "memory_domains": [ 00:19:11.499 { 00:19:11.499 "dma_device_id": "system", 00:19:11.499 "dma_device_type": 1 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.499 "dma_device_type": 2 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "system", 00:19:11.499 "dma_device_type": 1 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.499 "dma_device_type": 2 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "system", 00:19:11.499 "dma_device_type": 1 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.499 "dma_device_type": 2 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "system", 00:19:11.499 "dma_device_type": 1 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.499 "dma_device_type": 2 00:19:11.499 } 00:19:11.499 ], 00:19:11.499 "driver_specific": { 00:19:11.499 "raid": { 00:19:11.499 "uuid": "355bcfff-3da0-483b-89a7-7223b88783f3", 00:19:11.499 "strip_size_kb": 64, 00:19:11.499 "state": "online", 
00:19:11.499 "raid_level": "concat", 00:19:11.499 "superblock": false, 00:19:11.499 "num_base_bdevs": 4, 00:19:11.499 "num_base_bdevs_discovered": 4, 00:19:11.499 "num_base_bdevs_operational": 4, 00:19:11.499 "base_bdevs_list": [ 00:19:11.499 { 00:19:11.499 "name": "NewBaseBdev", 00:19:11.499 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:11.499 "is_configured": true, 00:19:11.499 "data_offset": 0, 00:19:11.499 "data_size": 65536 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "name": "BaseBdev2", 00:19:11.499 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:11.499 "is_configured": true, 00:19:11.499 "data_offset": 0, 00:19:11.499 "data_size": 65536 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "name": "BaseBdev3", 00:19:11.499 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:11.499 "is_configured": true, 00:19:11.499 "data_offset": 0, 00:19:11.499 "data_size": 65536 00:19:11.499 }, 00:19:11.499 { 00:19:11.499 "name": "BaseBdev4", 00:19:11.499 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:11.499 "is_configured": true, 00:19:11.499 "data_offset": 0, 00:19:11.499 "data_size": 65536 00:19:11.499 } 00:19:11.499 ] 00:19:11.499 } 00:19:11.499 } 00:19:11.499 }' 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:11.499 BaseBdev2 00:19:11.499 BaseBdev3 00:19:11.499 BaseBdev4' 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:11.499 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.757 13:37:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.757 "name": "NewBaseBdev", 00:19:11.757 "aliases": [ 00:19:11.757 "9cbdcb6d-1b57-4d03-a56e-dd48702901db" 00:19:11.757 ], 00:19:11.757 "product_name": "Malloc disk", 00:19:11.757 "block_size": 512, 00:19:11.757 "num_blocks": 65536, 00:19:11.757 "uuid": "9cbdcb6d-1b57-4d03-a56e-dd48702901db", 00:19:11.757 "assigned_rate_limits": { 00:19:11.757 "rw_ios_per_sec": 0, 00:19:11.757 "rw_mbytes_per_sec": 0, 00:19:11.757 "r_mbytes_per_sec": 0, 00:19:11.757 "w_mbytes_per_sec": 0 00:19:11.757 }, 00:19:11.757 "claimed": true, 00:19:11.757 "claim_type": "exclusive_write", 00:19:11.757 "zoned": false, 00:19:11.757 "supported_io_types": { 00:19:11.757 "read": true, 00:19:11.757 "write": true, 00:19:11.757 "unmap": true, 00:19:11.757 "flush": true, 00:19:11.757 "reset": true, 00:19:11.757 "nvme_admin": false, 00:19:11.757 "nvme_io": false, 00:19:11.757 "nvme_io_md": false, 00:19:11.757 "write_zeroes": true, 00:19:11.757 "zcopy": true, 00:19:11.757 "get_zone_info": false, 00:19:11.757 "zone_management": false, 00:19:11.757 "zone_append": false, 00:19:11.757 "compare": false, 00:19:11.757 "compare_and_write": false, 00:19:11.757 "abort": true, 00:19:11.757 "seek_hole": false, 00:19:11.757 "seek_data": false, 00:19:11.757 "copy": true, 00:19:11.757 "nvme_iov_md": false 00:19:11.757 }, 00:19:11.757 "memory_domains": [ 00:19:11.757 { 00:19:11.757 "dma_device_id": "system", 00:19:11.757 "dma_device_type": 1 00:19:11.757 }, 00:19:11.757 { 00:19:11.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.757 "dma_device_type": 2 00:19:11.757 } 00:19:11.757 ], 00:19:11.757 "driver_specific": {} 00:19:11.757 }' 00:19:11.757 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.757 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.014 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.272 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.272 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.272 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.272 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.530 "name": "BaseBdev2", 00:19:12.530 "aliases": [ 00:19:12.530 "071b8d6c-39e5-47f7-9c0d-1ffb525f5061" 00:19:12.530 ], 00:19:12.530 "product_name": "Malloc disk", 00:19:12.530 "block_size": 512, 00:19:12.530 "num_blocks": 65536, 00:19:12.530 "uuid": "071b8d6c-39e5-47f7-9c0d-1ffb525f5061", 00:19:12.530 "assigned_rate_limits": { 00:19:12.530 "rw_ios_per_sec": 0, 00:19:12.530 "rw_mbytes_per_sec": 0, 00:19:12.530 "r_mbytes_per_sec": 0, 00:19:12.530 "w_mbytes_per_sec": 0 00:19:12.530 }, 00:19:12.530 "claimed": true, 00:19:12.530 
"claim_type": "exclusive_write", 00:19:12.530 "zoned": false, 00:19:12.530 "supported_io_types": { 00:19:12.530 "read": true, 00:19:12.530 "write": true, 00:19:12.530 "unmap": true, 00:19:12.530 "flush": true, 00:19:12.530 "reset": true, 00:19:12.530 "nvme_admin": false, 00:19:12.530 "nvme_io": false, 00:19:12.530 "nvme_io_md": false, 00:19:12.530 "write_zeroes": true, 00:19:12.530 "zcopy": true, 00:19:12.530 "get_zone_info": false, 00:19:12.530 "zone_management": false, 00:19:12.530 "zone_append": false, 00:19:12.530 "compare": false, 00:19:12.530 "compare_and_write": false, 00:19:12.530 "abort": true, 00:19:12.530 "seek_hole": false, 00:19:12.530 "seek_data": false, 00:19:12.530 "copy": true, 00:19:12.530 "nvme_iov_md": false 00:19:12.530 }, 00:19:12.530 "memory_domains": [ 00:19:12.530 { 00:19:12.530 "dma_device_id": "system", 00:19:12.530 "dma_device_type": 1 00:19:12.530 }, 00:19:12.530 { 00:19:12.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.530 "dma_device_type": 2 00:19:12.530 } 00:19:12.530 ], 00:19:12.530 "driver_specific": {} 00:19:12.530 }' 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.530 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.788 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:19:12.788 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.788 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.788 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.788 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.788 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:12.788 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.046 "name": "BaseBdev3", 00:19:13.046 "aliases": [ 00:19:13.046 "e87ee355-0adb-496c-b475-e181ec45ba74" 00:19:13.046 ], 00:19:13.046 "product_name": "Malloc disk", 00:19:13.046 "block_size": 512, 00:19:13.046 "num_blocks": 65536, 00:19:13.046 "uuid": "e87ee355-0adb-496c-b475-e181ec45ba74", 00:19:13.046 "assigned_rate_limits": { 00:19:13.046 "rw_ios_per_sec": 0, 00:19:13.046 "rw_mbytes_per_sec": 0, 00:19:13.046 "r_mbytes_per_sec": 0, 00:19:13.046 "w_mbytes_per_sec": 0 00:19:13.046 }, 00:19:13.046 "claimed": true, 00:19:13.046 "claim_type": "exclusive_write", 00:19:13.046 "zoned": false, 00:19:13.046 "supported_io_types": { 00:19:13.046 "read": true, 00:19:13.046 "write": true, 00:19:13.046 "unmap": true, 00:19:13.046 "flush": true, 00:19:13.046 "reset": true, 00:19:13.046 "nvme_admin": false, 00:19:13.046 "nvme_io": false, 00:19:13.046 "nvme_io_md": false, 00:19:13.046 "write_zeroes": true, 00:19:13.046 "zcopy": true, 00:19:13.046 "get_zone_info": false, 00:19:13.046 "zone_management": false, 00:19:13.046 "zone_append": false, 00:19:13.046 "compare": false, 00:19:13.046 "compare_and_write": false, 00:19:13.046 "abort": true, 00:19:13.046 
"seek_hole": false, 00:19:13.046 "seek_data": false, 00:19:13.046 "copy": true, 00:19:13.046 "nvme_iov_md": false 00:19:13.046 }, 00:19:13.046 "memory_domains": [ 00:19:13.046 { 00:19:13.046 "dma_device_id": "system", 00:19:13.046 "dma_device_type": 1 00:19:13.046 }, 00:19:13.046 { 00:19:13.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.046 "dma_device_type": 2 00:19:13.046 } 00:19:13.046 ], 00:19:13.046 "driver_specific": {} 00:19:13.046 }' 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.046 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.047 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.047 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:19:13.305 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.562 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.562 "name": "BaseBdev4", 00:19:13.562 "aliases": [ 00:19:13.562 "b7773315-33b7-4eef-a49e-397d026a399d" 00:19:13.562 ], 00:19:13.562 "product_name": "Malloc disk", 00:19:13.562 "block_size": 512, 00:19:13.562 "num_blocks": 65536, 00:19:13.562 "uuid": "b7773315-33b7-4eef-a49e-397d026a399d", 00:19:13.562 "assigned_rate_limits": { 00:19:13.562 "rw_ios_per_sec": 0, 00:19:13.562 "rw_mbytes_per_sec": 0, 00:19:13.562 "r_mbytes_per_sec": 0, 00:19:13.562 "w_mbytes_per_sec": 0 00:19:13.562 }, 00:19:13.562 "claimed": true, 00:19:13.562 "claim_type": "exclusive_write", 00:19:13.562 "zoned": false, 00:19:13.562 "supported_io_types": { 00:19:13.562 "read": true, 00:19:13.562 "write": true, 00:19:13.562 "unmap": true, 00:19:13.562 "flush": true, 00:19:13.562 "reset": true, 00:19:13.562 "nvme_admin": false, 00:19:13.562 "nvme_io": false, 00:19:13.562 "nvme_io_md": false, 00:19:13.563 "write_zeroes": true, 00:19:13.563 "zcopy": true, 00:19:13.563 "get_zone_info": false, 00:19:13.563 "zone_management": false, 00:19:13.563 "zone_append": false, 00:19:13.563 "compare": false, 00:19:13.563 "compare_and_write": false, 00:19:13.563 "abort": true, 00:19:13.563 "seek_hole": false, 00:19:13.563 "seek_data": false, 00:19:13.563 "copy": true, 00:19:13.563 "nvme_iov_md": false 00:19:13.563 }, 00:19:13.563 "memory_domains": [ 00:19:13.563 { 00:19:13.563 "dma_device_id": "system", 00:19:13.563 "dma_device_type": 1 00:19:13.563 }, 00:19:13.563 { 00:19:13.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.563 "dma_device_type": 2 00:19:13.563 } 00:19:13.563 ], 00:19:13.563 "driver_specific": {} 00:19:13.563 }' 00:19:13.563 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.563 13:37:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.563 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.563 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.820 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.820 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.820 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.821 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.079 [2024-07-15 13:37:53.433714] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.079 [2024-07-15 13:37:53.433747] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:14.079 [2024-07-15 13:37:53.433812] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:14.079 [2024-07-15 13:37:53.433875] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:14.079 [2024-07-15 13:37:53.433887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f4b040 name Existed_Raid, state offline 00:19:14.079 13:37:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2142853 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2142853 ']' 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2142853 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2142853 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2142853' 00:19:14.079 killing process with pid 2142853 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2142853 00:19:14.079 [2024-07-15 13:37:53.502663] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:14.079 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2142853 00:19:14.337 [2024-07-15 13:37:53.545547] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:14.595 00:19:14.595 real 0m33.839s 00:19:14.595 user 1m2.153s 00:19:14.595 sys 0m5.969s 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.595 ************************************ 00:19:14.595 END TEST raid_state_function_test 
00:19:14.595 ************************************ 00:19:14.595 13:37:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:14.595 13:37:53 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:14.595 13:37:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:14.595 13:37:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:14.595 13:37:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:14.595 ************************************ 00:19:14.595 START TEST raid_state_function_test_sb 00:19:14.595 ************************************ 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2147903 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2147903' 00:19:14.595 Process raid pid: 2147903 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2147903 /var/tmp/spdk-raid.sock 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2147903 ']' 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:14.595 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:14.595 13:37:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:14.595 [2024-07-15 13:37:53.932723] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:19:14.595 [2024-07-15 13:37:53.932791] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:19:14.854 [2024-07-15 13:37:54.064249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:19:14.854 [2024-07-15 13:37:54.166047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:19:14.854 [2024-07-15 13:37:54.225684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:14.854 [2024-07-15 13:37:54.225719] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:15.787 13:37:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:19:15.787 13:37:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0
00:19:15.787 13:37:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:19:15.787 [2024-07-15 13:37:55.088045] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:19:15.787 [2024-07-15 13:37:55.088090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:19:15.787 [2024-07-15 13:37:55.088101] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:19:15.787 [2024-07-15 13:37:55.088113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:19:15.787 [2024-07-15 13:37:55.088121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:19:15.787 [2024-07-15 13:37:55.088133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:19:15.787 [2024-07-15 13:37:55.088142] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:19:15.787 [2024-07-15 13:37:55.088153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:15.787 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:16.045 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:16.045 "name": "Existed_Raid",
00:19:16.045 "uuid": "49bc609e-d8d9-4105-8bce-6c685a46d39a",
00:19:16.045 "strip_size_kb": 64,
00:19:16.045 "state": "configuring",
00:19:16.045 "raid_level": "concat",
00:19:16.045 "superblock": true,
00:19:16.045 "num_base_bdevs": 4,
00:19:16.045 "num_base_bdevs_discovered": 0,
00:19:16.045 "num_base_bdevs_operational": 4,
00:19:16.045 "base_bdevs_list": [
00:19:16.045 {
00:19:16.045 "name": "BaseBdev1",
00:19:16.045 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:16.045 "is_configured": false,
00:19:16.045 "data_offset": 0,
00:19:16.045 "data_size": 0
00:19:16.045 },
00:19:16.045 {
00:19:16.045 "name": "BaseBdev2",
00:19:16.045 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:16.045 "is_configured": false,
00:19:16.045 "data_offset": 0,
00:19:16.045 "data_size": 0
00:19:16.045 },
00:19:16.045 {
00:19:16.045 "name": "BaseBdev3",
00:19:16.045 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:16.045 "is_configured": false,
00:19:16.045 "data_offset": 0,
00:19:16.045 "data_size": 0
00:19:16.045 },
00:19:16.045 {
00:19:16.045 "name": "BaseBdev4",
00:19:16.045 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:16.045 "is_configured": false,
00:19:16.045 "data_offset": 0,
00:19:16.045 "data_size": 0
00:19:16.045 }
00:19:16.045 ]
00:19:16.045 }'
00:19:16.045 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:16.045 13:37:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:16.610 13:37:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:19:16.868 [2024-07-15 13:37:56.166936] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:19:16.868 [2024-07-15 13:37:56.166966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d02aa0 name Existed_Raid, state configuring
00:19:16.868 13:37:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:19:17.125 [2024-07-15 13:37:56.415616] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:19:17.125 [2024-07-15 13:37:56.415644] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:19:17.125 [2024-07-15 13:37:56.415653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:19:17.125 [2024-07-15 13:37:56.415665] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:19:17.125 [2024-07-15 13:37:56.415673] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:19:17.125 [2024-07-15 13:37:56.415685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:19:17.125 [2024-07-15 13:37:56.415694] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:19:17.125 [2024-07-15 13:37:56.415705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:19:17.125 13:37:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:19:17.382 [2024-07-15 13:37:56.670107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:19:17.382 BaseBdev1
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:17.382 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:17.639 13:37:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:19:17.897 [
00:19:17.897 {
00:19:17.897 "name": "BaseBdev1",
00:19:17.897 "aliases": [
00:19:17.897 "cb3cb7c6-547b-47b6-b977-3fb8b773443e"
00:19:17.897 ],
00:19:17.897 "product_name": "Malloc disk",
00:19:17.897 "block_size": 512,
00:19:17.897 "num_blocks": 65536,
00:19:17.897 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e",
00:19:17.897 "assigned_rate_limits": {
00:19:17.897 "rw_ios_per_sec": 0,
00:19:17.897 "rw_mbytes_per_sec": 0,
00:19:17.897 "r_mbytes_per_sec": 0,
00:19:17.897 "w_mbytes_per_sec": 0
00:19:17.897 },
00:19:17.897 "claimed": true,
00:19:17.897 "claim_type": "exclusive_write",
00:19:17.897 "zoned": false,
00:19:17.897 "supported_io_types": {
00:19:17.897 "read": true,
00:19:17.897 "write": true,
00:19:17.897 "unmap": true,
00:19:17.897 "flush": true,
00:19:17.897 "reset": true,
00:19:17.897 "nvme_admin": false,
00:19:17.897 "nvme_io": false,
00:19:17.897 "nvme_io_md": false,
00:19:17.897 "write_zeroes": true,
00:19:17.897 "zcopy": true,
00:19:17.897 "get_zone_info": false,
00:19:17.897 "zone_management": false,
00:19:17.897 "zone_append": false,
00:19:17.897 "compare": false,
00:19:17.897 "compare_and_write": false,
00:19:17.897 "abort": true,
00:19:17.897 "seek_hole": false,
00:19:17.897 "seek_data": false,
00:19:17.897 "copy": true,
00:19:17.897 "nvme_iov_md": false
00:19:17.897 },
00:19:17.897 "memory_domains": [
00:19:17.897 {
00:19:17.897 "dma_device_id": "system",
00:19:17.897 "dma_device_type": 1
00:19:17.897 },
00:19:17.897 {
00:19:17.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:17.897 "dma_device_type": 2
00:19:17.897 }
00:19:17.897 ],
00:19:17.897 "driver_specific": {}
00:19:17.897 }
00:19:17.897 ]
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:17.897 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:18.155 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:18.155 "name": "Existed_Raid",
00:19:18.155 "uuid": "82bd263c-b420-49d7-8a5a-5f2120d938bb",
00:19:18.155 "strip_size_kb": 64,
00:19:18.155 "state": "configuring",
00:19:18.155 "raid_level": "concat",
00:19:18.155 "superblock": true,
00:19:18.155 "num_base_bdevs": 4,
00:19:18.155 "num_base_bdevs_discovered": 1,
00:19:18.155 "num_base_bdevs_operational": 4,
00:19:18.155 "base_bdevs_list": [
00:19:18.155 {
00:19:18.155 "name": "BaseBdev1",
00:19:18.155 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e",
00:19:18.155 "is_configured": true,
00:19:18.155 "data_offset": 2048,
00:19:18.155 "data_size": 63488
00:19:18.155 },
00:19:18.155 {
00:19:18.155 "name": "BaseBdev2",
00:19:18.155 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:18.155 "is_configured": false,
00:19:18.155 "data_offset": 0,
00:19:18.155 "data_size": 0
00:19:18.155 },
00:19:18.155 {
00:19:18.155 "name": "BaseBdev3",
00:19:18.155 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:18.155 "is_configured": false,
00:19:18.155 "data_offset": 0,
00:19:18.155 "data_size": 0
00:19:18.155 },
00:19:18.155 {
00:19:18.155 "name": "BaseBdev4",
00:19:18.155 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:18.155 "is_configured": false,
00:19:18.155 "data_offset": 0,
00:19:18.155 "data_size": 0
00:19:18.155 }
00:19:18.155 ]
00:19:18.155 }'
00:19:18.155 13:37:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:18.155 13:37:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:18.720 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:19:18.978 [2024-07-15 13:37:58.222202] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:19:18.978 [2024-07-15 13:37:58.222247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d02310 name Existed_Raid, state configuring
00:19:18.978 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:19:19.236 [2024-07-15 13:37:58.466906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:19:19.236 [2024-07-15 13:37:58.468359] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:19:19.236 [2024-07-15 13:37:58.468393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:19:19.236 [2024-07-15 13:37:58.468404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:19:19.236 [2024-07-15 13:37:58.468417] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:19:19.236 [2024-07-15 13:37:58.468426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:19:19.236 [2024-07-15 13:37:58.468437] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:19.236 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:19.493 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:19.493 "name": "Existed_Raid",
00:19:19.493 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57",
00:19:19.493 "strip_size_kb": 64,
00:19:19.493 "state": "configuring",
00:19:19.493 "raid_level": "concat",
00:19:19.493 "superblock": true,
00:19:19.493 "num_base_bdevs": 4,
00:19:19.493 "num_base_bdevs_discovered": 1,
00:19:19.493 "num_base_bdevs_operational": 4,
00:19:19.493 "base_bdevs_list": [
00:19:19.493 {
00:19:19.493 "name": "BaseBdev1",
00:19:19.493 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e",
00:19:19.493 "is_configured": true,
00:19:19.493 "data_offset": 2048,
00:19:19.493 "data_size": 63488
00:19:19.493 },
00:19:19.493 {
00:19:19.493 "name": "BaseBdev2",
00:19:19.493 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:19.493 "is_configured": false,
00:19:19.493 "data_offset": 0,
00:19:19.493 "data_size": 0
00:19:19.493 },
00:19:19.493 {
00:19:19.493 "name": "BaseBdev3",
00:19:19.493 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:19.493 "is_configured": false,
00:19:19.493 "data_offset": 0,
00:19:19.493 "data_size": 0
00:19:19.493 },
00:19:19.493 {
00:19:19.493 "name": "BaseBdev4",
00:19:19.493 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:19.493 "is_configured": false,
00:19:19.493 "data_offset": 0,
00:19:19.493 "data_size": 0
00:19:19.493 }
00:19:19.493 ]
00:19:19.493 }'
00:19:19.493 13:37:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:19.493 13:37:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:20.057 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:19:20.314 [2024-07-15 13:37:59.581234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:19:20.314 BaseBdev2
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:20.314 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:20.572 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:19:20.829 [
00:19:20.829 {
00:19:20.829 "name": "BaseBdev2",
00:19:20.829 "aliases": [
00:19:20.829 "307a1eba-074e-4b9b-90a4-d498dc9d5f8f"
00:19:20.829 ],
00:19:20.829 "product_name": "Malloc disk",
00:19:20.829 "block_size": 512,
00:19:20.829 "num_blocks": 65536,
00:19:20.829 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f",
00:19:20.829 "assigned_rate_limits": {
00:19:20.829 "rw_ios_per_sec": 0,
00:19:20.829 "rw_mbytes_per_sec": 0,
00:19:20.829 "r_mbytes_per_sec": 0,
00:19:20.829 "w_mbytes_per_sec": 0
00:19:20.829 },
00:19:20.829 "claimed": true,
00:19:20.829 "claim_type": "exclusive_write",
00:19:20.829 "zoned": false,
00:19:20.829 "supported_io_types": {
00:19:20.829 "read": true,
00:19:20.829 "write": true,
00:19:20.829 "unmap": true,
00:19:20.829 "flush": true,
00:19:20.829 "reset": true,
00:19:20.829 "nvme_admin": false,
00:19:20.829 "nvme_io": false,
00:19:20.829 "nvme_io_md": false,
00:19:20.829 "write_zeroes": true,
00:19:20.829 "zcopy": true,
00:19:20.829 "get_zone_info": false,
00:19:20.829 "zone_management": false,
00:19:20.829 "zone_append": false,
00:19:20.829 "compare": false,
00:19:20.829 "compare_and_write": false,
00:19:20.829 "abort": true,
00:19:20.829 "seek_hole": false,
00:19:20.829 "seek_data": false,
00:19:20.829 "copy": true,
00:19:20.829 "nvme_iov_md": false
00:19:20.829 },
00:19:20.829 "memory_domains": [
00:19:20.829 {
00:19:20.829 "dma_device_id": "system",
00:19:20.829 "dma_device_type": 1
00:19:20.829 },
00:19:20.829 {
00:19:20.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:20.829 "dma_device_type": 2
00:19:20.829 }
00:19:20.829 ],
00:19:20.829 "driver_specific": {}
00:19:20.829 }
00:19:20.829 ]
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:20.829 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:21.086 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:21.086 "name": "Existed_Raid",
00:19:21.086 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57",
00:19:21.086 "strip_size_kb": 64,
00:19:21.086 "state": "configuring",
00:19:21.086 "raid_level": "concat",
00:19:21.086 "superblock": true,
00:19:21.086 "num_base_bdevs": 4,
00:19:21.086 "num_base_bdevs_discovered": 2,
00:19:21.086 "num_base_bdevs_operational": 4,
00:19:21.086 "base_bdevs_list": [
00:19:21.086 {
00:19:21.086 "name": "BaseBdev1",
00:19:21.086 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e",
00:19:21.086 "is_configured": true,
00:19:21.086 "data_offset": 2048,
00:19:21.086 "data_size": 63488
00:19:21.086 },
00:19:21.086 {
00:19:21.086 "name": "BaseBdev2",
00:19:21.086 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f",
00:19:21.086 "is_configured": true,
00:19:21.086 "data_offset": 2048,
00:19:21.086 "data_size": 63488
00:19:21.086 },
00:19:21.086 {
00:19:21.086 "name": "BaseBdev3",
00:19:21.086 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:21.086 "is_configured": false,
00:19:21.086 "data_offset": 0,
00:19:21.086 "data_size": 0
00:19:21.086 },
00:19:21.086 {
00:19:21.086 "name": "BaseBdev4",
00:19:21.086 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:21.086 "is_configured": false,
00:19:21.086 "data_offset": 0,
00:19:21.086 "data_size": 0
00:19:21.086 }
00:19:21.086 ]
00:19:21.086 }'
00:19:21.086 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:21.086 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:21.649 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:19:21.905 [2024-07-15 13:38:01.164826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:19:21.905 BaseBdev3
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:21.905 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:22.162 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:19:22.162 [
00:19:22.162 {
00:19:22.162 "name": "BaseBdev3",
00:19:22.162 "aliases": [
00:19:22.162 "94e13615-de99-4e85-aff5-4e32f93db6a0"
00:19:22.162 ],
00:19:22.162 "product_name": "Malloc disk",
00:19:22.162 "block_size": 512,
00:19:22.162 "num_blocks": 65536,
00:19:22.162 "uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0",
00:19:22.162 "assigned_rate_limits": {
00:19:22.162 "rw_ios_per_sec": 0,
00:19:22.162 "rw_mbytes_per_sec": 0,
00:19:22.162 "r_mbytes_per_sec": 0,
00:19:22.162 "w_mbytes_per_sec": 0
00:19:22.162 },
00:19:22.162 "claimed": true,
00:19:22.162 "claim_type": "exclusive_write",
00:19:22.162 "zoned": false,
00:19:22.162 "supported_io_types": {
00:19:22.162 "read": true,
00:19:22.162 "write": true,
00:19:22.162 "unmap": true,
00:19:22.162 "flush": true,
00:19:22.162 "reset": true,
00:19:22.162 "nvme_admin": false,
00:19:22.162 "nvme_io": false,
00:19:22.162 "nvme_io_md": false,
00:19:22.162 "write_zeroes": true,
00:19:22.162 "zcopy": true,
00:19:22.162 "get_zone_info": false,
00:19:22.162 "zone_management": false,
00:19:22.162 "zone_append": false,
00:19:22.162 "compare": false,
00:19:22.162 "compare_and_write": false,
00:19:22.162 "abort": true,
00:19:22.162 "seek_hole": false,
00:19:22.162 "seek_data": false,
00:19:22.162 "copy": true,
00:19:22.162 "nvme_iov_md": false
00:19:22.162 },
00:19:22.162 "memory_domains": [
00:19:22.162 {
00:19:22.162 "dma_device_id": "system",
00:19:22.162 "dma_device_type": 1
00:19:22.162 },
00:19:22.162 {
00:19:22.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:22.162 "dma_device_type": 2
00:19:22.162 }
00:19:22.162 ],
00:19:22.162 "driver_specific": {}
00:19:22.162 }
00:19:22.162 ]
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:22.419 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:22.676 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:22.676 "name": "Existed_Raid",
00:19:22.676 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57",
00:19:22.676 "strip_size_kb": 64,
00:19:22.676 "state": "configuring",
00:19:22.676 "raid_level": "concat",
00:19:22.676 "superblock": true,
00:19:22.676 "num_base_bdevs": 4,
00:19:22.676 "num_base_bdevs_discovered": 3,
00:19:22.676 "num_base_bdevs_operational": 4,
00:19:22.676 "base_bdevs_list": [
00:19:22.676 {
00:19:22.676 "name": "BaseBdev1",
00:19:22.676 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e",
00:19:22.676 "is_configured": true,
00:19:22.676 "data_offset": 2048,
00:19:22.676 "data_size": 63488
00:19:22.677 },
00:19:22.677 {
00:19:22.677 "name": "BaseBdev2",
00:19:22.677 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f",
00:19:22.677 "is_configured": true,
00:19:22.677 "data_offset": 2048,
00:19:22.677 "data_size": 63488
00:19:22.677 },
00:19:22.677 {
00:19:22.677 "name": "BaseBdev3",
00:19:22.677 "uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0",
00:19:22.677 "is_configured": true,
00:19:22.677 "data_offset": 2048,
00:19:22.677 "data_size": 63488
00:19:22.677 },
00:19:22.677 {
00:19:22.677 "name": "BaseBdev4",
00:19:22.677 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:22.677 "is_configured": false,
00:19:22.677 "data_offset": 0,
00:19:22.677 "data_size": 0
00:19:22.677 }
00:19:22.677 ]
00:19:22.677 }'
00:19:22.677 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:22.677 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:19:23.241 [2024-07-15 13:38:02.640064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:19:23.241 [2024-07-15 13:38:02.640239] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d03350
00:19:23.241 [2024-07-15 13:38:02.640253] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:19:23.241 [2024-07-15 13:38:02.640433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d03020
00:19:23.241 [2024-07-15 13:38:02.640551] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d03350
00:19:23.241 [2024-07-15 13:38:02.640561] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d03350
00:19:23.241 [2024-07-15 13:38:02.640651] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:19:23.241 BaseBdev4
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:23.241 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:23.499 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:19:23.768 [
00:19:23.768 {
00:19:23.768 "name": "BaseBdev4",
00:19:23.768 "aliases": [
00:19:23.768 "9fe27587-4b52-485c-8760-e22548e577f3"
00:19:23.768 ],
00:19:23.768 "product_name": "Malloc disk",
00:19:23.768 "block_size": 512,
00:19:23.768 "num_blocks": 65536,
00:19:23.768 "uuid": "9fe27587-4b52-485c-8760-e22548e577f3",
00:19:23.768 "assigned_rate_limits": {
00:19:23.768 "rw_ios_per_sec": 0,
00:19:23.768 "rw_mbytes_per_sec": 0,
00:19:23.768 "r_mbytes_per_sec": 0,
00:19:23.768 "w_mbytes_per_sec": 0
00:19:23.768 },
00:19:23.768 "claimed": true,
00:19:23.768 "claim_type": "exclusive_write",
00:19:23.768 "zoned": false,
00:19:23.768 "supported_io_types": {
00:19:23.768 "read": true,
00:19:23.768 "write": true,
00:19:23.768 "unmap": true,
00:19:23.768 "flush": true,
00:19:23.768 "reset": true,
00:19:23.768 "nvme_admin": false,
00:19:23.768 "nvme_io": false,
00:19:23.768 "nvme_io_md": false,
00:19:23.768 "write_zeroes": true,
00:19:23.768 "zcopy": true,
00:19:23.768 "get_zone_info": false,
00:19:23.768 "zone_management": false,
00:19:23.768 "zone_append": false,
00:19:23.768 "compare": false,
00:19:23.768 "compare_and_write": false,
00:19:23.768 "abort": true,
00:19:23.768 "seek_hole": false,
00:19:23.768 "seek_data": false,
00:19:23.768 "copy": true,
00:19:23.768 "nvme_iov_md": false
00:19:23.768 },
00:19:23.768 "memory_domains": [
00:19:23.768 {
00:19:23.768 "dma_device_id": "system",
00:19:23.768 "dma_device_type": 1
00:19:23.768 },
00:19:23.768 {
00:19:23.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:23.768 "dma_device_type": 2
00:19:23.768 }
00:19:23.768 ],
00:19:23.768 "driver_specific": {}
00:19:23.768 }
00:19:23.768 ]
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:23.768 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:24.038 13:38:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.038 "name": "Existed_Raid", 00:19:24.038 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57", 00:19:24.038 "strip_size_kb": 64, 00:19:24.038 "state": "online", 00:19:24.039 "raid_level": "concat", 00:19:24.039 "superblock": true, 00:19:24.039 "num_base_bdevs": 4, 00:19:24.039 "num_base_bdevs_discovered": 4, 00:19:24.039 "num_base_bdevs_operational": 4, 00:19:24.039 "base_bdevs_list": [ 00:19:24.039 { 00:19:24.039 "name": "BaseBdev1", 00:19:24.039 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e", 00:19:24.039 "is_configured": true, 00:19:24.039 "data_offset": 2048, 00:19:24.039 "data_size": 63488 00:19:24.039 }, 00:19:24.039 { 00:19:24.039 "name": "BaseBdev2", 00:19:24.039 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f", 00:19:24.039 "is_configured": true, 00:19:24.039 "data_offset": 2048, 00:19:24.039 "data_size": 63488 00:19:24.039 }, 00:19:24.039 { 00:19:24.039 "name": "BaseBdev3", 00:19:24.039 "uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0", 00:19:24.039 "is_configured": true, 00:19:24.039 "data_offset": 2048, 00:19:24.039 "data_size": 63488 00:19:24.039 }, 00:19:24.039 { 00:19:24.039 "name": "BaseBdev4", 00:19:24.039 "uuid": "9fe27587-4b52-485c-8760-e22548e577f3", 00:19:24.039 "is_configured": true, 00:19:24.039 "data_offset": 2048, 00:19:24.039 "data_size": 63488 00:19:24.039 } 00:19:24.039 ] 00:19:24.039 }' 00:19:24.039 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.039 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:24.601 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:24.857 [2024-07-15 13:38:04.088255] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:24.857 "name": "Existed_Raid", 00:19:24.857 "aliases": [ 00:19:24.857 "0ced30db-71f6-4aa8-9339-1c0832c9da57" 00:19:24.857 ], 00:19:24.857 "product_name": "Raid Volume", 00:19:24.857 "block_size": 512, 00:19:24.857 "num_blocks": 253952, 00:19:24.857 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57", 00:19:24.857 "assigned_rate_limits": { 00:19:24.857 "rw_ios_per_sec": 0, 00:19:24.857 "rw_mbytes_per_sec": 0, 00:19:24.857 "r_mbytes_per_sec": 0, 00:19:24.857 "w_mbytes_per_sec": 0 00:19:24.857 }, 00:19:24.857 "claimed": false, 00:19:24.857 "zoned": false, 00:19:24.857 "supported_io_types": { 00:19:24.857 "read": true, 00:19:24.857 "write": true, 00:19:24.857 "unmap": true, 00:19:24.857 "flush": true, 00:19:24.857 "reset": true, 00:19:24.857 "nvme_admin": false, 00:19:24.857 "nvme_io": false, 00:19:24.857 "nvme_io_md": false, 00:19:24.857 "write_zeroes": true, 00:19:24.857 "zcopy": false, 00:19:24.857 "get_zone_info": false, 00:19:24.857 "zone_management": false, 00:19:24.857 "zone_append": false, 00:19:24.857 "compare": false, 00:19:24.857 "compare_and_write": false, 00:19:24.857 "abort": false, 00:19:24.857 "seek_hole": 
false, 00:19:24.857 "seek_data": false, 00:19:24.857 "copy": false, 00:19:24.857 "nvme_iov_md": false 00:19:24.857 }, 00:19:24.857 "memory_domains": [ 00:19:24.857 { 00:19:24.857 "dma_device_id": "system", 00:19:24.857 "dma_device_type": 1 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.857 "dma_device_type": 2 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "system", 00:19:24.857 "dma_device_type": 1 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.857 "dma_device_type": 2 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "system", 00:19:24.857 "dma_device_type": 1 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.857 "dma_device_type": 2 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "system", 00:19:24.857 "dma_device_type": 1 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.857 "dma_device_type": 2 00:19:24.857 } 00:19:24.857 ], 00:19:24.857 "driver_specific": { 00:19:24.857 "raid": { 00:19:24.857 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57", 00:19:24.857 "strip_size_kb": 64, 00:19:24.857 "state": "online", 00:19:24.857 "raid_level": "concat", 00:19:24.857 "superblock": true, 00:19:24.857 "num_base_bdevs": 4, 00:19:24.857 "num_base_bdevs_discovered": 4, 00:19:24.857 "num_base_bdevs_operational": 4, 00:19:24.857 "base_bdevs_list": [ 00:19:24.857 { 00:19:24.857 "name": "BaseBdev1", 00:19:24.857 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e", 00:19:24.857 "is_configured": true, 00:19:24.857 "data_offset": 2048, 00:19:24.857 "data_size": 63488 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "name": "BaseBdev2", 00:19:24.857 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f", 00:19:24.857 "is_configured": true, 00:19:24.857 "data_offset": 2048, 00:19:24.857 "data_size": 63488 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "name": "BaseBdev3", 00:19:24.857 
"uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0", 00:19:24.857 "is_configured": true, 00:19:24.857 "data_offset": 2048, 00:19:24.857 "data_size": 63488 00:19:24.857 }, 00:19:24.857 { 00:19:24.857 "name": "BaseBdev4", 00:19:24.857 "uuid": "9fe27587-4b52-485c-8760-e22548e577f3", 00:19:24.857 "is_configured": true, 00:19:24.857 "data_offset": 2048, 00:19:24.857 "data_size": 63488 00:19:24.857 } 00:19:24.857 ] 00:19:24.857 } 00:19:24.857 } 00:19:24.857 }' 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:24.857 BaseBdev2 00:19:24.857 BaseBdev3 00:19:24.857 BaseBdev4' 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:24.857 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.120 "name": "BaseBdev1", 00:19:25.120 "aliases": [ 00:19:25.120 "cb3cb7c6-547b-47b6-b977-3fb8b773443e" 00:19:25.120 ], 00:19:25.120 "product_name": "Malloc disk", 00:19:25.120 "block_size": 512, 00:19:25.120 "num_blocks": 65536, 00:19:25.120 "uuid": "cb3cb7c6-547b-47b6-b977-3fb8b773443e", 00:19:25.120 "assigned_rate_limits": { 00:19:25.120 "rw_ios_per_sec": 0, 00:19:25.120 "rw_mbytes_per_sec": 0, 00:19:25.120 "r_mbytes_per_sec": 0, 00:19:25.120 "w_mbytes_per_sec": 0 00:19:25.120 }, 00:19:25.120 "claimed": true, 00:19:25.120 "claim_type": "exclusive_write", 00:19:25.120 "zoned": false, 00:19:25.120 "supported_io_types": { 
00:19:25.120 "read": true, 00:19:25.120 "write": true, 00:19:25.120 "unmap": true, 00:19:25.120 "flush": true, 00:19:25.120 "reset": true, 00:19:25.120 "nvme_admin": false, 00:19:25.120 "nvme_io": false, 00:19:25.120 "nvme_io_md": false, 00:19:25.120 "write_zeroes": true, 00:19:25.120 "zcopy": true, 00:19:25.120 "get_zone_info": false, 00:19:25.120 "zone_management": false, 00:19:25.120 "zone_append": false, 00:19:25.120 "compare": false, 00:19:25.120 "compare_and_write": false, 00:19:25.120 "abort": true, 00:19:25.120 "seek_hole": false, 00:19:25.120 "seek_data": false, 00:19:25.120 "copy": true, 00:19:25.120 "nvme_iov_md": false 00:19:25.120 }, 00:19:25.120 "memory_domains": [ 00:19:25.120 { 00:19:25.120 "dma_device_id": "system", 00:19:25.120 "dma_device_type": 1 00:19:25.120 }, 00:19:25.120 { 00:19:25.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.120 "dma_device_type": 2 00:19:25.120 } 00:19:25.120 ], 00:19:25.120 "driver_specific": {} 00:19:25.120 }' 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.120 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:25.380 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:25.636 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.636 "name": "BaseBdev2", 00:19:25.636 "aliases": [ 00:19:25.636 "307a1eba-074e-4b9b-90a4-d498dc9d5f8f" 00:19:25.636 ], 00:19:25.636 "product_name": "Malloc disk", 00:19:25.636 "block_size": 512, 00:19:25.636 "num_blocks": 65536, 00:19:25.636 "uuid": "307a1eba-074e-4b9b-90a4-d498dc9d5f8f", 00:19:25.636 "assigned_rate_limits": { 00:19:25.636 "rw_ios_per_sec": 0, 00:19:25.636 "rw_mbytes_per_sec": 0, 00:19:25.636 "r_mbytes_per_sec": 0, 00:19:25.636 "w_mbytes_per_sec": 0 00:19:25.636 }, 00:19:25.636 "claimed": true, 00:19:25.636 "claim_type": "exclusive_write", 00:19:25.636 "zoned": false, 00:19:25.636 "supported_io_types": { 00:19:25.636 "read": true, 00:19:25.636 "write": true, 00:19:25.636 "unmap": true, 00:19:25.636 "flush": true, 00:19:25.636 "reset": true, 00:19:25.636 "nvme_admin": false, 00:19:25.636 "nvme_io": false, 00:19:25.636 "nvme_io_md": false, 00:19:25.636 "write_zeroes": true, 00:19:25.636 "zcopy": true, 00:19:25.636 "get_zone_info": false, 00:19:25.636 "zone_management": false, 00:19:25.636 "zone_append": false, 00:19:25.636 "compare": false, 00:19:25.636 "compare_and_write": false, 00:19:25.636 "abort": true, 00:19:25.636 "seek_hole": false, 00:19:25.636 "seek_data": 
false, 00:19:25.636 "copy": true, 00:19:25.636 "nvme_iov_md": false 00:19:25.636 }, 00:19:25.636 "memory_domains": [ 00:19:25.636 { 00:19:25.636 "dma_device_id": "system", 00:19:25.636 "dma_device_type": 1 00:19:25.636 }, 00:19:25.636 { 00:19:25.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.636 "dma_device_type": 2 00:19:25.636 } 00:19:25.636 ], 00:19:25.636 "driver_specific": {} 00:19:25.636 }' 00:19:25.636 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.636 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.636 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.636 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.636 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:19:25.893 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:26.150 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:26.150 "name": "BaseBdev3", 00:19:26.150 "aliases": [ 00:19:26.150 "94e13615-de99-4e85-aff5-4e32f93db6a0" 00:19:26.150 ], 00:19:26.150 "product_name": "Malloc disk", 00:19:26.150 "block_size": 512, 00:19:26.150 "num_blocks": 65536, 00:19:26.150 "uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0", 00:19:26.150 "assigned_rate_limits": { 00:19:26.150 "rw_ios_per_sec": 0, 00:19:26.150 "rw_mbytes_per_sec": 0, 00:19:26.150 "r_mbytes_per_sec": 0, 00:19:26.150 "w_mbytes_per_sec": 0 00:19:26.150 }, 00:19:26.150 "claimed": true, 00:19:26.150 "claim_type": "exclusive_write", 00:19:26.150 "zoned": false, 00:19:26.150 "supported_io_types": { 00:19:26.150 "read": true, 00:19:26.150 "write": true, 00:19:26.150 "unmap": true, 00:19:26.150 "flush": true, 00:19:26.150 "reset": true, 00:19:26.150 "nvme_admin": false, 00:19:26.150 "nvme_io": false, 00:19:26.150 "nvme_io_md": false, 00:19:26.150 "write_zeroes": true, 00:19:26.150 "zcopy": true, 00:19:26.150 "get_zone_info": false, 00:19:26.150 "zone_management": false, 00:19:26.150 "zone_append": false, 00:19:26.150 "compare": false, 00:19:26.150 "compare_and_write": false, 00:19:26.150 "abort": true, 00:19:26.150 "seek_hole": false, 00:19:26.150 "seek_data": false, 00:19:26.150 "copy": true, 00:19:26.150 "nvme_iov_md": false 00:19:26.150 }, 00:19:26.150 "memory_domains": [ 00:19:26.150 { 00:19:26.150 "dma_device_id": "system", 00:19:26.150 "dma_device_type": 1 00:19:26.150 }, 00:19:26.150 { 00:19:26.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.150 "dma_device_type": 2 00:19:26.150 } 00:19:26.150 ], 00:19:26.150 "driver_specific": {} 00:19:26.150 }' 00:19:26.150 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.150 13:38:05 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:26.407 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:26.664 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:26.664 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:26.664 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:26.664 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:26.664 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:26.664 "name": "BaseBdev4", 00:19:26.664 "aliases": [ 00:19:26.664 "9fe27587-4b52-485c-8760-e22548e577f3" 00:19:26.664 ], 00:19:26.664 "product_name": "Malloc disk", 00:19:26.664 "block_size": 512, 00:19:26.664 "num_blocks": 65536, 00:19:26.664 "uuid": "9fe27587-4b52-485c-8760-e22548e577f3", 00:19:26.664 "assigned_rate_limits": { 00:19:26.664 
"rw_ios_per_sec": 0, 00:19:26.664 "rw_mbytes_per_sec": 0, 00:19:26.664 "r_mbytes_per_sec": 0, 00:19:26.664 "w_mbytes_per_sec": 0 00:19:26.664 }, 00:19:26.664 "claimed": true, 00:19:26.664 "claim_type": "exclusive_write", 00:19:26.664 "zoned": false, 00:19:26.664 "supported_io_types": { 00:19:26.664 "read": true, 00:19:26.664 "write": true, 00:19:26.664 "unmap": true, 00:19:26.664 "flush": true, 00:19:26.664 "reset": true, 00:19:26.664 "nvme_admin": false, 00:19:26.664 "nvme_io": false, 00:19:26.664 "nvme_io_md": false, 00:19:26.664 "write_zeroes": true, 00:19:26.664 "zcopy": true, 00:19:26.664 "get_zone_info": false, 00:19:26.664 "zone_management": false, 00:19:26.664 "zone_append": false, 00:19:26.664 "compare": false, 00:19:26.664 "compare_and_write": false, 00:19:26.664 "abort": true, 00:19:26.664 "seek_hole": false, 00:19:26.664 "seek_data": false, 00:19:26.664 "copy": true, 00:19:26.664 "nvme_iov_md": false 00:19:26.664 }, 00:19:26.664 "memory_domains": [ 00:19:26.664 { 00:19:26.664 "dma_device_id": "system", 00:19:26.664 "dma_device_type": 1 00:19:26.664 }, 00:19:26.664 { 00:19:26.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.664 "dma_device_type": 2 00:19:26.664 } 00:19:26.664 ], 00:19:26.664 "driver_specific": {} 00:19:26.664 }' 00:19:26.664 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:26.922 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.179 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.179 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:27.179 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:27.438 [2024-07-15 13:38:06.646777] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:27.438 [2024-07-15 13:38:06.646810] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:27.438 [2024-07-15 13:38:06.646869] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.438 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.696 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.696 "name": "Existed_Raid", 00:19:27.696 "uuid": "0ced30db-71f6-4aa8-9339-1c0832c9da57", 00:19:27.696 "strip_size_kb": 64, 00:19:27.696 "state": "offline", 00:19:27.696 "raid_level": "concat", 00:19:27.696 "superblock": true, 00:19:27.696 "num_base_bdevs": 4, 00:19:27.696 "num_base_bdevs_discovered": 3, 00:19:27.696 "num_base_bdevs_operational": 3, 00:19:27.696 "base_bdevs_list": [ 00:19:27.696 { 00:19:27.696 "name": null, 00:19:27.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.696 "is_configured": false, 00:19:27.696 "data_offset": 2048, 00:19:27.696 "data_size": 63488 00:19:27.696 }, 00:19:27.696 { 00:19:27.696 "name": "BaseBdev2", 00:19:27.696 "uuid": 
"307a1eba-074e-4b9b-90a4-d498dc9d5f8f", 00:19:27.696 "is_configured": true, 00:19:27.696 "data_offset": 2048, 00:19:27.696 "data_size": 63488 00:19:27.696 }, 00:19:27.696 { 00:19:27.696 "name": "BaseBdev3", 00:19:27.696 "uuid": "94e13615-de99-4e85-aff5-4e32f93db6a0", 00:19:27.696 "is_configured": true, 00:19:27.696 "data_offset": 2048, 00:19:27.696 "data_size": 63488 00:19:27.696 }, 00:19:27.696 { 00:19:27.696 "name": "BaseBdev4", 00:19:27.696 "uuid": "9fe27587-4b52-485c-8760-e22548e577f3", 00:19:27.696 "is_configured": true, 00:19:27.696 "data_offset": 2048, 00:19:27.696 "data_size": 63488 00:19:27.696 } 00:19:27.696 ] 00:19:27.696 }' 00:19:27.696 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.696 13:38:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.262 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:28.262 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:28.262 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.262 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:28.520 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:28.520 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:28.520 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:29.085 [2024-07-15 13:38:08.236946] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:29.085 13:38:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:29.085 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:29.085 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:29.085 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:19:29.342 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:19:29.342 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:19:29.342 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:19:29.600 [2024-07-15 13:38:09.003446] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:19:29.858 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:29.858 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:29.858 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:29.858 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:19:30.116 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:19:30.116 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:19:30.116 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
00:19:30.373 [2024-07-15 13:38:09.768070] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
00:19:30.373 [2024-07-15 13:38:09.768119] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d03350 name Existed_Raid, state offline
00:19:30.631 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:30.631 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:30.631 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:30.631 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:30.890 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:19:30.890 BaseBdev2
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:31.148 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:31.406 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:19:31.665 [
00:19:31.665 {
00:19:31.665 "name": "BaseBdev2",
00:19:31.665 "aliases": [
00:19:31.665 "17bcc803-587a-4c45-9135-7c5492893fea"
00:19:31.665 ],
00:19:31.665 "product_name": "Malloc disk",
00:19:31.665 "block_size": 512,
00:19:31.665 "num_blocks": 65536,
00:19:31.665 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:31.665 "assigned_rate_limits": {
00:19:31.665 "rw_ios_per_sec": 0,
00:19:31.665 "rw_mbytes_per_sec": 0,
00:19:31.665 "r_mbytes_per_sec": 0,
00:19:31.665 "w_mbytes_per_sec": 0
00:19:31.665 },
00:19:31.665 "claimed": false,
00:19:31.665 "zoned": false,
00:19:31.665 "supported_io_types": {
00:19:31.665 "read": true,
00:19:31.665 "write": true,
00:19:31.665 "unmap": true,
00:19:31.665 "flush": true,
00:19:31.665 "reset": true,
00:19:31.665 "nvme_admin": false,
00:19:31.665 "nvme_io": false,
00:19:31.665 "nvme_io_md": false,
00:19:31.665 "write_zeroes": true,
00:19:31.665 "zcopy": true,
00:19:31.665 "get_zone_info": false,
00:19:31.665 "zone_management": false,
00:19:31.665 "zone_append": false,
00:19:31.665 "compare": false,
00:19:31.665 "compare_and_write": false,
00:19:31.665 "abort": true,
00:19:31.665 "seek_hole": false,
00:19:31.665 "seek_data": false,
00:19:31.665 "copy": true,
00:19:31.665 "nvme_iov_md": false
00:19:31.665 },
00:19:31.665 "memory_domains": [
00:19:31.665 {
00:19:31.665 "dma_device_id": "system",
00:19:31.665 "dma_device_type": 1
00:19:31.665 },
00:19:31.665 {
00:19:31.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:31.665 "dma_device_type": 2
00:19:31.665 }
00:19:31.665 ],
00:19:31.665 "driver_specific": {}
00:19:31.665 }
00:19:31.665 ]
00:19:31.665 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:31.665 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:31.665 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:31.665 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:19:32.232 BaseBdev3
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:32.232 13:38:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:32.797 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:19:33.054 [
00:19:33.054 {
00:19:33.054 "name": "BaseBdev3",
00:19:33.054 "aliases": [
00:19:33.054 "a456f5e3-789a-411f-bc3c-ad6b49649741"
00:19:33.054 ],
00:19:33.054 "product_name": "Malloc disk",
00:19:33.054 "block_size": 512,
00:19:33.054 "num_blocks": 65536,
00:19:33.054 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:33.054 "assigned_rate_limits": {
00:19:33.054 "rw_ios_per_sec": 0,
00:19:33.054 "rw_mbytes_per_sec": 0,
00:19:33.054 "r_mbytes_per_sec": 0,
00:19:33.054 "w_mbytes_per_sec": 0
00:19:33.054 },
00:19:33.054 "claimed": false,
00:19:33.054 "zoned": false,
00:19:33.054 "supported_io_types": {
00:19:33.054 "read": true,
00:19:33.054 "write": true,
00:19:33.054 "unmap": true,
00:19:33.054 "flush": true,
00:19:33.054 "reset": true,
00:19:33.054 "nvme_admin": false,
00:19:33.054 "nvme_io": false,
00:19:33.054 "nvme_io_md": false,
00:19:33.054 "write_zeroes": true,
00:19:33.054 "zcopy": true,
00:19:33.054 "get_zone_info": false,
00:19:33.054 "zone_management": false,
00:19:33.054 "zone_append": false,
00:19:33.054 "compare": false,
00:19:33.054 "compare_and_write": false,
00:19:33.054 "abort": true,
00:19:33.054 "seek_hole": false,
00:19:33.054 "seek_data": false,
00:19:33.054 "copy": true,
00:19:33.054 "nvme_iov_md": false
00:19:33.054 },
00:19:33.054 "memory_domains": [
00:19:33.054 {
00:19:33.054 "dma_device_id": "system",
00:19:33.054 "dma_device_type": 1
00:19:33.054 },
00:19:33.054 {
00:19:33.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:33.054 "dma_device_type": 2
00:19:33.054 }
00:19:33.054 ],
00:19:33.054 "driver_specific": {}
00:19:33.054 }
00:19:33.054 ]
00:19:33.054 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:33.054 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:33.054 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:33.054 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:19:33.619 BaseBdev4
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:33.619 13:38:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:33.875 13:38:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:19:34.439 [
00:19:34.439 {
00:19:34.439 "name": "BaseBdev4",
00:19:34.439 "aliases": [
00:19:34.439 "64bef806-7685-4bf7-9846-caef9fc2783a"
00:19:34.439 ],
00:19:34.439 "product_name": "Malloc disk",
00:19:34.439 "block_size": 512,
00:19:34.439 "num_blocks": 65536,
00:19:34.439 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:34.439 "assigned_rate_limits": {
00:19:34.439 "rw_ios_per_sec": 0,
00:19:34.439 "rw_mbytes_per_sec": 0,
00:19:34.439 "r_mbytes_per_sec": 0,
00:19:34.439 "w_mbytes_per_sec": 0
00:19:34.439 },
00:19:34.439 "claimed": false,
00:19:34.439 "zoned": false,
00:19:34.439 "supported_io_types": {
00:19:34.439 "read": true,
00:19:34.439 "write": true,
00:19:34.439 "unmap": true,
00:19:34.439 "flush": true,
00:19:34.439 "reset": true,
00:19:34.439 "nvme_admin": false,
00:19:34.439 "nvme_io": false,
00:19:34.439 "nvme_io_md": false,
00:19:34.439 "write_zeroes": true,
00:19:34.439 "zcopy": true,
00:19:34.439 "get_zone_info": false,
00:19:34.439 "zone_management": false,
00:19:34.439 "zone_append": false,
00:19:34.439 "compare": false,
00:19:34.439 "compare_and_write": false,
00:19:34.439 "abort": true,
00:19:34.439 "seek_hole": false,
00:19:34.439 "seek_data": false,
00:19:34.439 "copy": true,
00:19:34.439 "nvme_iov_md": false
00:19:34.439 },
00:19:34.439 "memory_domains": [
00:19:34.439 {
00:19:34.439 "dma_device_id": "system",
00:19:34.439 "dma_device_type": 1
00:19:34.439 },
00:19:34.439 {
00:19:34.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:34.439 "dma_device_type": 2
00:19:34.439 }
00:19:34.439 ],
00:19:34.439 "driver_specific": {}
00:19:34.439 }
00:19:34.439 ]
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:19:34.439 [2024-07-15 13:38:13.833618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:19:34.439 [2024-07-15 13:38:13.833666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:19:34.439 [2024-07-15 13:38:13.833688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:19:34.439 [2024-07-15 13:38:13.835085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:19:34.439 [2024-07-15 13:38:13.835131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:34.439 13:38:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:34.697 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:34.697 "name": "Existed_Raid",
00:19:34.697 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:34.697 "strip_size_kb": 64,
00:19:34.697 "state": "configuring",
00:19:34.697 "raid_level": "concat",
00:19:34.697 "superblock": true,
00:19:34.697 "num_base_bdevs": 4,
00:19:34.697 "num_base_bdevs_discovered": 3,
00:19:34.697 "num_base_bdevs_operational": 4,
00:19:34.697 "base_bdevs_list": [
00:19:34.697 {
00:19:34.697 "name": "BaseBdev1",
00:19:34.697 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:34.697 "is_configured": false,
00:19:34.697 "data_offset": 0,
00:19:34.697 "data_size": 0
00:19:34.697 },
00:19:34.697 {
00:19:34.697 "name": "BaseBdev2",
00:19:34.697 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:34.697 "is_configured": true,
00:19:34.697 "data_offset": 2048,
00:19:34.697 "data_size": 63488
00:19:34.697 },
00:19:34.697 {
00:19:34.697 "name": "BaseBdev3",
00:19:34.697 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:34.697 "is_configured": true,
00:19:34.697 "data_offset": 2048,
00:19:34.697 "data_size": 63488
00:19:34.697 },
00:19:34.697 {
00:19:34.697 "name": "BaseBdev4",
00:19:34.697 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:34.697 "is_configured": true,
00:19:34.697 "data_offset": 2048,
00:19:34.697 "data_size": 63488
00:19:34.697 }
00:19:34.697 ]
00:19:34.697 }'
00:19:34.697 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:34.697 13:38:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:35.262 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:19:35.519 [2024-07-15 13:38:14.896390] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:35.519 13:38:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:35.777 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:35.777 "name": "Existed_Raid",
00:19:35.777 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:35.777 "strip_size_kb": 64,
00:19:35.777 "state": "configuring",
00:19:35.777 "raid_level": "concat",
00:19:35.777 "superblock": true,
00:19:35.777 "num_base_bdevs": 4,
00:19:35.777 "num_base_bdevs_discovered": 2,
00:19:35.777 "num_base_bdevs_operational": 4,
00:19:35.777 "base_bdevs_list": [
00:19:35.777 {
00:19:35.777 "name": "BaseBdev1",
00:19:35.777 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:35.777 "is_configured": false,
00:19:35.777 "data_offset": 0,
00:19:35.777 "data_size": 0
00:19:35.777 },
00:19:35.777 {
00:19:35.777 "name": null,
00:19:35.777 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:35.777 "is_configured": false,
00:19:35.777 "data_offset": 2048,
00:19:35.777 "data_size": 63488
00:19:35.777 },
00:19:35.777 {
00:19:35.777 "name": "BaseBdev3",
00:19:35.777 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:35.777 "is_configured": true,
00:19:35.777 "data_offset": 2048,
00:19:35.777 "data_size": 63488
00:19:35.777 },
00:19:35.777 {
00:19:35.777 "name": "BaseBdev4",
00:19:35.777 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:35.777 "is_configured": true,
00:19:35.777 "data_offset": 2048,
00:19:35.777 "data_size": 63488
00:19:35.777 }
00:19:35.777 ]
00:19:35.777 }'
00:19:35.777 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:35.777 13:38:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:36.341 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:36.341 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:19:36.599 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:19:36.599 13:38:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:19:36.856 [2024-07-15 13:38:16.212088] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:19:36.856 BaseBdev1
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:19:36.856 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:37.113 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:19:37.371 [
00:19:37.371 {
00:19:37.371 "name": "BaseBdev1",
00:19:37.371 "aliases": [
00:19:37.371 "fcab2602-5c79-4509-b09f-4612910cc195"
00:19:37.371 ],
00:19:37.371 "product_name": "Malloc disk",
00:19:37.371 "block_size": 512,
00:19:37.371 "num_blocks": 65536,
00:19:37.371 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195",
00:19:37.371 "assigned_rate_limits": {
00:19:37.371 "rw_ios_per_sec": 0,
00:19:37.371 "rw_mbytes_per_sec": 0,
00:19:37.371 "r_mbytes_per_sec": 0,
00:19:37.371 "w_mbytes_per_sec": 0
00:19:37.371 },
00:19:37.371 "claimed": true,
00:19:37.371 "claim_type": "exclusive_write",
00:19:37.371 "zoned": false,
00:19:37.371 "supported_io_types": {
00:19:37.371 "read": true,
00:19:37.371 "write": true,
00:19:37.371 "unmap": true,
00:19:37.371 "flush": true,
00:19:37.371 "reset": true,
00:19:37.371 "nvme_admin": false,
00:19:37.371 "nvme_io": false,
00:19:37.371 "nvme_io_md": false,
00:19:37.371 "write_zeroes": true,
00:19:37.371 "zcopy": true,
00:19:37.371 "get_zone_info": false,
00:19:37.371 "zone_management": false,
00:19:37.371 "zone_append": false,
00:19:37.371 "compare": false,
00:19:37.371 "compare_and_write": false,
00:19:37.371 "abort": true,
00:19:37.371 "seek_hole": false,
00:19:37.371 "seek_data": false,
00:19:37.371 "copy": true,
00:19:37.371 "nvme_iov_md": false
00:19:37.371 },
00:19:37.371 "memory_domains": [
00:19:37.371 {
00:19:37.371 "dma_device_id": "system",
00:19:37.371 "dma_device_type": 1
00:19:37.371 },
00:19:37.371 {
00:19:37.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:37.371 "dma_device_type": 2
00:19:37.371 }
00:19:37.371 ],
00:19:37.371 "driver_specific": {}
00:19:37.371 }
00:19:37.371 ]
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:37.371 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:37.629 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:37.629 "name": "Existed_Raid",
00:19:37.629 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:37.629 "strip_size_kb": 64,
00:19:37.629 "state": "configuring",
00:19:37.629 "raid_level": "concat",
00:19:37.629 "superblock": true,
00:19:37.629 "num_base_bdevs": 4,
00:19:37.629 "num_base_bdevs_discovered": 3,
00:19:37.629 "num_base_bdevs_operational": 4,
00:19:37.629 "base_bdevs_list": [
00:19:37.629 {
00:19:37.629 "name": "BaseBdev1",
00:19:37.629 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195",
00:19:37.629 "is_configured": true,
00:19:37.629 "data_offset": 2048,
00:19:37.629 "data_size": 63488
00:19:37.629 },
00:19:37.629 {
00:19:37.629 "name": null,
00:19:37.629 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:37.629 "is_configured": false,
00:19:37.629 "data_offset": 2048,
00:19:37.629 "data_size": 63488
00:19:37.629 },
00:19:37.629 {
00:19:37.629 "name": "BaseBdev3",
00:19:37.629 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:37.629 "is_configured": true,
00:19:37.629 "data_offset": 2048,
00:19:37.629 "data_size": 63488
00:19:37.629 },
00:19:37.629 {
00:19:37.629 "name": "BaseBdev4",
00:19:37.629 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:37.629 "is_configured": true,
00:19:37.629 "data_offset": 2048,
00:19:37.629 "data_size": 63488
00:19:37.629 }
00:19:37.629 ]
00:19:37.629 }'
00:19:37.629 13:38:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:37.629 13:38:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:38.205 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:38.205 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:19:38.462 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:19:38.462 13:38:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:19:38.719 [2024-07-15 13:38:18.045007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:38.719 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:38.977 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:38.977 "name": "Existed_Raid",
00:19:38.977 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:38.977 "strip_size_kb": 64,
00:19:38.977 "state": "configuring",
00:19:38.977 "raid_level": "concat",
00:19:38.977 "superblock": true,
00:19:38.977 "num_base_bdevs": 4,
00:19:38.977 "num_base_bdevs_discovered": 2,
00:19:38.977 "num_base_bdevs_operational": 4,
00:19:38.977 "base_bdevs_list": [
00:19:38.977 {
00:19:38.977 "name": "BaseBdev1",
00:19:38.977 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195",
00:19:38.977 "is_configured": true,
00:19:38.977 "data_offset": 2048,
00:19:38.977 "data_size": 63488
00:19:38.977 },
00:19:38.977 {
00:19:38.977 "name": null,
00:19:38.977 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:38.977 "is_configured": false,
00:19:38.977 "data_offset": 2048,
00:19:38.977 "data_size": 63488
00:19:38.977 },
00:19:38.977 {
00:19:38.977 "name": null,
00:19:38.977 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:38.977 "is_configured": false,
00:19:38.977 "data_offset": 2048,
00:19:38.977 "data_size": 63488
00:19:38.977 },
00:19:38.977 {
00:19:38.977 "name": "BaseBdev4",
00:19:38.977 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:38.977 "is_configured": true,
00:19:38.977 "data_offset": 2048,
00:19:38.977 "data_size": 63488
00:19:38.977 }
00:19:38.977 ]
00:19:38.977 }'
00:19:38.977 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:38.977 13:38:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:39.542 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:39.542 13:38:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:19:39.799 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:19:39.799 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:19:40.056 [2024-07-15 13:38:19.248208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:40.056 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:40.621 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:40.621 "name": "Existed_Raid",
00:19:40.621 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:40.621 "strip_size_kb": 64,
00:19:40.621 "state": "configuring",
00:19:40.621 "raid_level": "concat",
00:19:40.621 "superblock": true,
00:19:40.621 "num_base_bdevs": 4,
00:19:40.621 "num_base_bdevs_discovered": 3,
00:19:40.621 "num_base_bdevs_operational": 4,
00:19:40.621 "base_bdevs_list": [
00:19:40.622 {
00:19:40.622 "name": "BaseBdev1",
00:19:40.622 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195",
00:19:40.622 "is_configured": true,
00:19:40.622 "data_offset": 2048,
00:19:40.622 "data_size": 63488
00:19:40.622 },
00:19:40.622 {
00:19:40.622 "name": null,
00:19:40.622 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:40.622 "is_configured": false,
00:19:40.622 "data_offset": 2048,
00:19:40.622 "data_size": 63488
00:19:40.622 },
00:19:40.622 {
00:19:40.622 "name": "BaseBdev3",
00:19:40.622 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:40.622 "is_configured": true,
00:19:40.622 "data_offset": 2048,
00:19:40.622 "data_size": 63488
00:19:40.622 },
00:19:40.622 {
00:19:40.622 "name": "BaseBdev4",
00:19:40.622 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:40.622 "is_configured": true,
00:19:40.622 "data_offset": 2048,
00:19:40.622 "data_size": 63488
00:19:40.622 }
00:19:40.622 ]
00:19:40.622 }'
00:19:40.622 13:38:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:40.622 13:38:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:41.187 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:19:41.187 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:41.445 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:19:41.445 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:19:41.702 [2024-07-15 13:38:20.912632] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:41.702 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:41.959 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:41.959 "name": "Existed_Raid",
00:19:41.959 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33",
00:19:41.959 "strip_size_kb": 64,
00:19:41.959 "state": "configuring",
00:19:41.959 "raid_level": "concat",
00:19:41.959 "superblock": true,
00:19:41.959 "num_base_bdevs": 4,
00:19:41.959 "num_base_bdevs_discovered": 2,
00:19:41.959 "num_base_bdevs_operational": 4,
00:19:41.959 "base_bdevs_list": [
00:19:41.959 {
00:19:41.959 "name": null,
00:19:41.959 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195",
00:19:41.959 "is_configured": false,
00:19:41.959 "data_offset": 2048,
00:19:41.959 "data_size": 63488
00:19:41.959 },
00:19:41.959 {
00:19:41.959 "name": null,
00:19:41.959 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea",
00:19:41.959 "is_configured": false,
00:19:41.959 "data_offset": 2048,
00:19:41.959 "data_size": 63488
00:19:41.959 },
00:19:41.959 {
00:19:41.959 "name": "BaseBdev3",
00:19:41.959 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741",
00:19:41.959 "is_configured": true,
00:19:41.959 "data_offset": 2048,
00:19:41.959 "data_size": 63488
00:19:41.959 },
00:19:41.959 {
00:19:41.959 "name": "BaseBdev4",
00:19:41.959 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a",
00:19:41.959 "is_configured": true,
00:19:41.959 "data_offset": 2048,
00:19:41.960 "data_size": 63488
00:19:41.960 }
00:19:41.960 ]
00:19:41.960 }'
00:19:41.960 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:41.960 13:38:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:19:42.524 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:42.524 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:19:42.524 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:19:42.524 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:19:42.781 [2024-07-15 13:38:22.124101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock
bdev_raid_get_bdevs all 00:19:42.781 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.039 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.039 "name": "Existed_Raid", 00:19:43.039 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33", 00:19:43.039 "strip_size_kb": 64, 00:19:43.039 "state": "configuring", 00:19:43.039 "raid_level": "concat", 00:19:43.039 "superblock": true, 00:19:43.039 "num_base_bdevs": 4, 00:19:43.039 "num_base_bdevs_discovered": 3, 00:19:43.039 "num_base_bdevs_operational": 4, 00:19:43.039 "base_bdevs_list": [ 00:19:43.039 { 00:19:43.039 "name": null, 00:19:43.039 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195", 00:19:43.039 "is_configured": false, 00:19:43.039 "data_offset": 2048, 00:19:43.039 "data_size": 63488 00:19:43.039 }, 00:19:43.039 { 00:19:43.039 "name": "BaseBdev2", 00:19:43.039 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea", 00:19:43.039 "is_configured": true, 00:19:43.039 "data_offset": 2048, 00:19:43.039 "data_size": 63488 00:19:43.039 }, 00:19:43.039 { 00:19:43.039 "name": "BaseBdev3", 00:19:43.039 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741", 00:19:43.039 "is_configured": true, 00:19:43.039 "data_offset": 2048, 00:19:43.039 "data_size": 63488 00:19:43.039 }, 00:19:43.039 { 00:19:43.039 "name": "BaseBdev4", 00:19:43.039 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a", 00:19:43.039 "is_configured": true, 00:19:43.039 "data_offset": 2048, 00:19:43.039 "data_size": 63488 00:19:43.039 } 00:19:43.039 ] 00:19:43.039 }' 00:19:43.039 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.039 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.603 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:43.603 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:43.860 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:43.860 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.860 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:44.117 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fcab2602-5c79-4509-b09f-4612910cc195 00:19:44.373 [2024-07-15 13:38:23.716946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:44.373 [2024-07-15 13:38:23.717124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d05850 00:19:44.373 [2024-07-15 13:38:23.717138] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:44.373 [2024-07-15 13:38:23.717321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfbd80 00:19:44.373 [2024-07-15 13:38:23.717440] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d05850 00:19:44.373 [2024-07-15 13:38:23.717450] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d05850 00:19:44.373 [2024-07-15 13:38:23.717542] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:44.373 NewBaseBdev 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=NewBaseBdev 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:44.373 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:44.630 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:44.888 [ 00:19:44.888 { 00:19:44.888 "name": "NewBaseBdev", 00:19:44.888 "aliases": [ 00:19:44.888 "fcab2602-5c79-4509-b09f-4612910cc195" 00:19:44.888 ], 00:19:44.888 "product_name": "Malloc disk", 00:19:44.888 "block_size": 512, 00:19:44.888 "num_blocks": 65536, 00:19:44.888 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195", 00:19:44.888 "assigned_rate_limits": { 00:19:44.888 "rw_ios_per_sec": 0, 00:19:44.888 "rw_mbytes_per_sec": 0, 00:19:44.888 "r_mbytes_per_sec": 0, 00:19:44.888 "w_mbytes_per_sec": 0 00:19:44.888 }, 00:19:44.888 "claimed": true, 00:19:44.888 "claim_type": "exclusive_write", 00:19:44.888 "zoned": false, 00:19:44.888 "supported_io_types": { 00:19:44.888 "read": true, 00:19:44.888 "write": true, 00:19:44.888 "unmap": true, 00:19:44.888 "flush": true, 00:19:44.888 "reset": true, 00:19:44.888 "nvme_admin": false, 00:19:44.888 "nvme_io": false, 00:19:44.888 "nvme_io_md": false, 00:19:44.888 "write_zeroes": true, 00:19:44.888 "zcopy": true, 00:19:44.888 "get_zone_info": false, 00:19:44.888 "zone_management": false, 00:19:44.888 "zone_append": false, 
00:19:44.888 "compare": false, 00:19:44.888 "compare_and_write": false, 00:19:44.888 "abort": true, 00:19:44.888 "seek_hole": false, 00:19:44.888 "seek_data": false, 00:19:44.888 "copy": true, 00:19:44.888 "nvme_iov_md": false 00:19:44.888 }, 00:19:44.888 "memory_domains": [ 00:19:44.888 { 00:19:44.888 "dma_device_id": "system", 00:19:44.888 "dma_device_type": 1 00:19:44.888 }, 00:19:44.888 { 00:19:44.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.888 "dma_device_type": 2 00:19:44.888 } 00:19:44.888 ], 00:19:44.888 "driver_specific": {} 00:19:44.888 } 00:19:44.888 ] 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.888 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:45.145 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.145 "name": "Existed_Raid", 00:19:45.145 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33", 00:19:45.145 "strip_size_kb": 64, 00:19:45.145 "state": "online", 00:19:45.145 "raid_level": "concat", 00:19:45.145 "superblock": true, 00:19:45.145 "num_base_bdevs": 4, 00:19:45.145 "num_base_bdevs_discovered": 4, 00:19:45.145 "num_base_bdevs_operational": 4, 00:19:45.145 "base_bdevs_list": [ 00:19:45.145 { 00:19:45.145 "name": "NewBaseBdev", 00:19:45.145 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195", 00:19:45.145 "is_configured": true, 00:19:45.145 "data_offset": 2048, 00:19:45.145 "data_size": 63488 00:19:45.145 }, 00:19:45.145 { 00:19:45.145 "name": "BaseBdev2", 00:19:45.145 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea", 00:19:45.145 "is_configured": true, 00:19:45.145 "data_offset": 2048, 00:19:45.145 "data_size": 63488 00:19:45.145 }, 00:19:45.145 { 00:19:45.145 "name": "BaseBdev3", 00:19:45.145 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741", 00:19:45.145 "is_configured": true, 00:19:45.145 "data_offset": 2048, 00:19:45.145 "data_size": 63488 00:19:45.145 }, 00:19:45.145 { 00:19:45.145 "name": "BaseBdev4", 00:19:45.145 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a", 00:19:45.145 "is_configured": true, 00:19:45.145 "data_offset": 2048, 00:19:45.145 "data_size": 63488 00:19:45.145 } 00:19:45.145 ] 00:19:45.145 }' 00:19:45.145 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.145 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # 
verify_raid_bdev_properties Existed_Raid 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:45.710 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:45.966 [2024-07-15 13:38:25.281428] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.966 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:45.966 "name": "Existed_Raid", 00:19:45.966 "aliases": [ 00:19:45.966 "309e56b3-5b9d-4ec1-993d-0fcc97491f33" 00:19:45.966 ], 00:19:45.966 "product_name": "Raid Volume", 00:19:45.966 "block_size": 512, 00:19:45.966 "num_blocks": 253952, 00:19:45.966 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33", 00:19:45.966 "assigned_rate_limits": { 00:19:45.966 "rw_ios_per_sec": 0, 00:19:45.966 "rw_mbytes_per_sec": 0, 00:19:45.966 "r_mbytes_per_sec": 0, 00:19:45.966 "w_mbytes_per_sec": 0 00:19:45.966 }, 00:19:45.966 "claimed": false, 00:19:45.966 "zoned": false, 00:19:45.966 "supported_io_types": { 00:19:45.966 "read": true, 00:19:45.966 "write": true, 00:19:45.966 "unmap": true, 00:19:45.966 "flush": true, 00:19:45.966 "reset": true, 00:19:45.966 "nvme_admin": false, 00:19:45.966 "nvme_io": false, 00:19:45.966 "nvme_io_md": false, 00:19:45.966 
"write_zeroes": true, 00:19:45.966 "zcopy": false, 00:19:45.966 "get_zone_info": false, 00:19:45.966 "zone_management": false, 00:19:45.966 "zone_append": false, 00:19:45.966 "compare": false, 00:19:45.966 "compare_and_write": false, 00:19:45.966 "abort": false, 00:19:45.966 "seek_hole": false, 00:19:45.966 "seek_data": false, 00:19:45.966 "copy": false, 00:19:45.966 "nvme_iov_md": false 00:19:45.966 }, 00:19:45.966 "memory_domains": [ 00:19:45.966 { 00:19:45.966 "dma_device_id": "system", 00:19:45.966 "dma_device_type": 1 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.966 "dma_device_type": 2 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "system", 00:19:45.966 "dma_device_type": 1 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.966 "dma_device_type": 2 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "system", 00:19:45.966 "dma_device_type": 1 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.966 "dma_device_type": 2 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "system", 00:19:45.966 "dma_device_type": 1 00:19:45.966 }, 00:19:45.966 { 00:19:45.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.966 "dma_device_type": 2 00:19:45.966 } 00:19:45.966 ], 00:19:45.966 "driver_specific": { 00:19:45.966 "raid": { 00:19:45.966 "uuid": "309e56b3-5b9d-4ec1-993d-0fcc97491f33", 00:19:45.966 "strip_size_kb": 64, 00:19:45.966 "state": "online", 00:19:45.966 "raid_level": "concat", 00:19:45.966 "superblock": true, 00:19:45.966 "num_base_bdevs": 4, 00:19:45.966 "num_base_bdevs_discovered": 4, 00:19:45.966 "num_base_bdevs_operational": 4, 00:19:45.966 "base_bdevs_list": [ 00:19:45.966 { 00:19:45.966 "name": "NewBaseBdev", 00:19:45.966 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195", 00:19:45.966 "is_configured": true, 00:19:45.966 "data_offset": 2048, 00:19:45.966 "data_size": 63488 00:19:45.967 }, 
00:19:45.967 { 00:19:45.967 "name": "BaseBdev2", 00:19:45.967 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea", 00:19:45.967 "is_configured": true, 00:19:45.967 "data_offset": 2048, 00:19:45.967 "data_size": 63488 00:19:45.967 }, 00:19:45.967 { 00:19:45.967 "name": "BaseBdev3", 00:19:45.967 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741", 00:19:45.967 "is_configured": true, 00:19:45.967 "data_offset": 2048, 00:19:45.967 "data_size": 63488 00:19:45.967 }, 00:19:45.967 { 00:19:45.967 "name": "BaseBdev4", 00:19:45.967 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a", 00:19:45.967 "is_configured": true, 00:19:45.967 "data_offset": 2048, 00:19:45.967 "data_size": 63488 00:19:45.967 } 00:19:45.967 ] 00:19:45.967 } 00:19:45.967 } 00:19:45.967 }' 00:19:45.967 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:45.967 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:45.967 BaseBdev2 00:19:45.967 BaseBdev3 00:19:45.967 BaseBdev4' 00:19:45.967 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:45.967 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:45.967 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:46.223 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:46.223 "name": "NewBaseBdev", 00:19:46.223 "aliases": [ 00:19:46.223 "fcab2602-5c79-4509-b09f-4612910cc195" 00:19:46.223 ], 00:19:46.223 "product_name": "Malloc disk", 00:19:46.223 "block_size": 512, 00:19:46.223 "num_blocks": 65536, 00:19:46.223 "uuid": "fcab2602-5c79-4509-b09f-4612910cc195", 00:19:46.223 "assigned_rate_limits": { 00:19:46.223 
"rw_ios_per_sec": 0, 00:19:46.223 "rw_mbytes_per_sec": 0, 00:19:46.223 "r_mbytes_per_sec": 0, 00:19:46.223 "w_mbytes_per_sec": 0 00:19:46.223 }, 00:19:46.223 "claimed": true, 00:19:46.223 "claim_type": "exclusive_write", 00:19:46.223 "zoned": false, 00:19:46.223 "supported_io_types": { 00:19:46.223 "read": true, 00:19:46.223 "write": true, 00:19:46.223 "unmap": true, 00:19:46.223 "flush": true, 00:19:46.223 "reset": true, 00:19:46.223 "nvme_admin": false, 00:19:46.223 "nvme_io": false, 00:19:46.223 "nvme_io_md": false, 00:19:46.223 "write_zeroes": true, 00:19:46.223 "zcopy": true, 00:19:46.223 "get_zone_info": false, 00:19:46.223 "zone_management": false, 00:19:46.223 "zone_append": false, 00:19:46.223 "compare": false, 00:19:46.223 "compare_and_write": false, 00:19:46.223 "abort": true, 00:19:46.223 "seek_hole": false, 00:19:46.223 "seek_data": false, 00:19:46.223 "copy": true, 00:19:46.223 "nvme_iov_md": false 00:19:46.223 }, 00:19:46.223 "memory_domains": [ 00:19:46.223 { 00:19:46.223 "dma_device_id": "system", 00:19:46.223 "dma_device_type": 1 00:19:46.223 }, 00:19:46.223 { 00:19:46.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.223 "dma_device_type": 2 00:19:46.223 } 00:19:46.223 ], 00:19:46.223 "driver_specific": {} 00:19:46.223 }' 00:19:46.223 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.223 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.479 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.735 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.735 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.735 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:46.735 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:46.991 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:46.991 "name": "BaseBdev2", 00:19:46.991 "aliases": [ 00:19:46.991 "17bcc803-587a-4c45-9135-7c5492893fea" 00:19:46.991 ], 00:19:46.991 "product_name": "Malloc disk", 00:19:46.991 "block_size": 512, 00:19:46.991 "num_blocks": 65536, 00:19:46.991 "uuid": "17bcc803-587a-4c45-9135-7c5492893fea", 00:19:46.991 "assigned_rate_limits": { 00:19:46.991 "rw_ios_per_sec": 0, 00:19:46.991 "rw_mbytes_per_sec": 0, 00:19:46.991 "r_mbytes_per_sec": 0, 00:19:46.991 "w_mbytes_per_sec": 0 00:19:46.991 }, 00:19:46.991 "claimed": true, 00:19:46.991 "claim_type": "exclusive_write", 00:19:46.991 "zoned": false, 00:19:46.991 "supported_io_types": { 00:19:46.991 "read": true, 00:19:46.991 "write": true, 00:19:46.991 "unmap": true, 00:19:46.991 "flush": true, 00:19:46.991 "reset": true, 00:19:46.991 "nvme_admin": false, 00:19:46.991 "nvme_io": false, 00:19:46.991 "nvme_io_md": false, 00:19:46.991 "write_zeroes": true, 
00:19:46.991 "zcopy": true, 00:19:46.991 "get_zone_info": false, 00:19:46.991 "zone_management": false, 00:19:46.991 "zone_append": false, 00:19:46.991 "compare": false, 00:19:46.991 "compare_and_write": false, 00:19:46.991 "abort": true, 00:19:46.991 "seek_hole": false, 00:19:46.991 "seek_data": false, 00:19:46.991 "copy": true, 00:19:46.991 "nvme_iov_md": false 00:19:46.991 }, 00:19:46.991 "memory_domains": [ 00:19:46.991 { 00:19:46.991 "dma_device_id": "system", 00:19:46.991 "dma_device_type": 1 00:19:46.991 }, 00:19:46.991 { 00:19:46.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.991 "dma_device_type": 2 00:19:46.991 } 00:19:46.991 ], 00:19:46.991 "driver_specific": {} 00:19:46.991 }' 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.992 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.247 13:38:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:47.247 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.504 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.504 "name": "BaseBdev3", 00:19:47.504 "aliases": [ 00:19:47.504 "a456f5e3-789a-411f-bc3c-ad6b49649741" 00:19:47.504 ], 00:19:47.504 "product_name": "Malloc disk", 00:19:47.504 "block_size": 512, 00:19:47.504 "num_blocks": 65536, 00:19:47.504 "uuid": "a456f5e3-789a-411f-bc3c-ad6b49649741", 00:19:47.504 "assigned_rate_limits": { 00:19:47.504 "rw_ios_per_sec": 0, 00:19:47.504 "rw_mbytes_per_sec": 0, 00:19:47.504 "r_mbytes_per_sec": 0, 00:19:47.504 "w_mbytes_per_sec": 0 00:19:47.504 }, 00:19:47.504 "claimed": true, 00:19:47.504 "claim_type": "exclusive_write", 00:19:47.504 "zoned": false, 00:19:47.504 "supported_io_types": { 00:19:47.504 "read": true, 00:19:47.504 "write": true, 00:19:47.504 "unmap": true, 00:19:47.504 "flush": true, 00:19:47.504 "reset": true, 00:19:47.504 "nvme_admin": false, 00:19:47.504 "nvme_io": false, 00:19:47.504 "nvme_io_md": false, 00:19:47.504 "write_zeroes": true, 00:19:47.504 "zcopy": true, 00:19:47.504 "get_zone_info": false, 00:19:47.504 "zone_management": false, 00:19:47.504 "zone_append": false, 00:19:47.504 "compare": false, 00:19:47.504 "compare_and_write": false, 00:19:47.504 "abort": true, 00:19:47.504 "seek_hole": false, 00:19:47.504 "seek_data": false, 00:19:47.504 "copy": true, 00:19:47.504 "nvme_iov_md": false 00:19:47.504 }, 00:19:47.504 "memory_domains": [ 00:19:47.504 { 00:19:47.504 "dma_device_id": "system", 00:19:47.504 "dma_device_type": 1 00:19:47.504 }, 00:19:47.504 { 00:19:47.504 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:47.504 "dma_device_type": 2 00:19:47.504 } 00:19:47.504 ], 00:19:47.504 "driver_specific": {} 00:19:47.504 }' 00:19:47.504 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.504 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.760 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.760 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.760 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.760 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.015 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.015 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.016 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:48.016 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:48.271 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:48.271 "name": "BaseBdev4", 00:19:48.271 
"aliases": [ 00:19:48.271 "64bef806-7685-4bf7-9846-caef9fc2783a" 00:19:48.271 ], 00:19:48.271 "product_name": "Malloc disk", 00:19:48.271 "block_size": 512, 00:19:48.271 "num_blocks": 65536, 00:19:48.271 "uuid": "64bef806-7685-4bf7-9846-caef9fc2783a", 00:19:48.271 "assigned_rate_limits": { 00:19:48.271 "rw_ios_per_sec": 0, 00:19:48.271 "rw_mbytes_per_sec": 0, 00:19:48.271 "r_mbytes_per_sec": 0, 00:19:48.271 "w_mbytes_per_sec": 0 00:19:48.271 }, 00:19:48.271 "claimed": true, 00:19:48.271 "claim_type": "exclusive_write", 00:19:48.271 "zoned": false, 00:19:48.271 "supported_io_types": { 00:19:48.271 "read": true, 00:19:48.271 "write": true, 00:19:48.271 "unmap": true, 00:19:48.271 "flush": true, 00:19:48.271 "reset": true, 00:19:48.271 "nvme_admin": false, 00:19:48.271 "nvme_io": false, 00:19:48.271 "nvme_io_md": false, 00:19:48.271 "write_zeroes": true, 00:19:48.271 "zcopy": true, 00:19:48.271 "get_zone_info": false, 00:19:48.271 "zone_management": false, 00:19:48.271 "zone_append": false, 00:19:48.271 "compare": false, 00:19:48.271 "compare_and_write": false, 00:19:48.271 "abort": true, 00:19:48.271 "seek_hole": false, 00:19:48.271 "seek_data": false, 00:19:48.271 "copy": true, 00:19:48.271 "nvme_iov_md": false 00:19:48.271 }, 00:19:48.271 "memory_domains": [ 00:19:48.271 { 00:19:48.271 "dma_device_id": "system", 00:19:48.271 "dma_device_type": 1 00:19:48.271 }, 00:19:48.271 { 00:19:48.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.272 "dma_device_type": 2 00:19:48.272 } 00:19:48.272 ], 00:19:48.272 "driver_specific": {} 00:19:48.272 }' 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.272 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.527 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:48.528 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.528 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.528 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.528 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:48.783 [2024-07-15 13:38:28.016363] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:48.783 [2024-07-15 13:38:28.016391] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:48.783 [2024-07-15 13:38:28.016444] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:48.783 [2024-07-15 13:38:28.016508] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:48.783 [2024-07-15 13:38:28.016520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d05850 name Existed_Raid, state offline 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2147903 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2147903 ']' 00:19:48.783 13:38:28 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2147903 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2147903 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2147903' 00:19:48.783 killing process with pid 2147903 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2147903 00:19:48.783 [2024-07-15 13:38:28.072553] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:48.783 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2147903 00:19:48.783 [2024-07-15 13:38:28.108438] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:49.040 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:49.040 00:19:49.040 real 0m34.454s 00:19:49.040 user 1m3.341s 00:19:49.040 sys 0m6.014s 00:19:49.040 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:49.040 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.040 ************************************ 00:19:49.040 END TEST raid_state_function_test_sb 00:19:49.040 ************************************ 00:19:49.040 13:38:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:49.040 13:38:28 bdev_raid -- 
bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:19:49.040 13:38:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:49.040 13:38:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:49.040 13:38:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:49.040 ************************************ 00:19:49.040 START TEST raid_superblock_test 00:19:49.040 ************************************ 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:49.040 13:38:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2152962 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2152962 /var/tmp/spdk-raid.sock 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2152962 ']' 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:49.040 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:49.040 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.040 [2024-07-15 13:38:28.447242] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:19:49.040 [2024-07-15 13:38:28.447304] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2152962 ] 00:19:49.297 [2024-07-15 13:38:28.565180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:49.297 [2024-07-15 13:38:28.668278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:49.554 [2024-07-15 13:38:28.730802] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:49.554 [2024-07-15 13:38:28.730837] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:50.118 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:50.375 malloc1 00:19:50.375 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:50.632 [2024-07-15 13:38:29.825092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:50.632 [2024-07-15 13:38:29.825140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.632 [2024-07-15 13:38:29.825162] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a7f570 00:19:50.632 [2024-07-15 13:38:29.825175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.632 [2024-07-15 13:38:29.826840] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.632 [2024-07-15 13:38:29.826868] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:50.632 pt1 00:19:50.632 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:50.632 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:50.632 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:50.633 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:50.633 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:50.633 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:50.633 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:50.633 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:50.633 13:38:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:51.197 malloc2 00:19:51.197 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:51.454 [2024-07-15 13:38:30.840527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:51.454 [2024-07-15 13:38:30.840576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.454 [2024-07-15 13:38:30.840594] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a80970 00:19:51.454 [2024-07-15 13:38:30.840606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.454 [2024-07-15 13:38:30.842271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.454 [2024-07-15 13:38:30.842303] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:51.454 pt2 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:51.454 13:38:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:51.454 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:51.709 malloc3 00:19:51.709 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:51.990 [2024-07-15 13:38:31.327647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:51.990 [2024-07-15 13:38:31.327695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.990 [2024-07-15 13:38:31.327712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c17340 00:19:51.990 [2024-07-15 13:38:31.327725] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.990 [2024-07-15 13:38:31.329275] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.990 [2024-07-15 13:38:31.329304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:51.990 pt3 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:51.990 
13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:51.990 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:52.258 malloc4 00:19:52.258 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:52.514 [2024-07-15 13:38:31.806974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:52.514 [2024-07-15 13:38:31.807020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:52.514 [2024-07-15 13:38:31.807041] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c19c60 00:19:52.514 [2024-07-15 13:38:31.807053] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:52.514 [2024-07-15 13:38:31.808590] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:52.514 [2024-07-15 13:38:31.808617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:52.514 pt4 00:19:52.514 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:52.514 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:52.514 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:52.771 [2024-07-15 13:38:32.039605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:19:52.771 [2024-07-15 13:38:32.040897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:52.771 [2024-07-15 13:38:32.040961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:52.771 [2024-07-15 13:38:32.041007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:52.771 [2024-07-15 13:38:32.041173] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a77530 00:19:52.771 [2024-07-15 13:38:32.041186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:52.771 [2024-07-15 13:38:32.041382] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a75770 00:19:52.771 [2024-07-15 13:38:32.041528] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a77530 00:19:52.771 [2024-07-15 13:38:32.041539] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a77530 00:19:52.771 [2024-07-15 13:38:32.041635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:52.771 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:52.771 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.772 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.029 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.029 "name": "raid_bdev1", 00:19:53.029 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:19:53.029 "strip_size_kb": 64, 00:19:53.029 "state": "online", 00:19:53.029 "raid_level": "concat", 00:19:53.029 "superblock": true, 00:19:53.029 "num_base_bdevs": 4, 00:19:53.029 "num_base_bdevs_discovered": 4, 00:19:53.029 "num_base_bdevs_operational": 4, 00:19:53.029 "base_bdevs_list": [ 00:19:53.029 { 00:19:53.029 "name": "pt1", 00:19:53.029 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:53.029 "is_configured": true, 00:19:53.029 "data_offset": 2048, 00:19:53.029 "data_size": 63488 00:19:53.029 }, 00:19:53.029 { 00:19:53.029 "name": "pt2", 00:19:53.029 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.029 "is_configured": true, 00:19:53.029 "data_offset": 2048, 00:19:53.029 "data_size": 63488 00:19:53.029 }, 00:19:53.029 { 00:19:53.029 "name": "pt3", 00:19:53.029 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.029 "is_configured": true, 00:19:53.029 "data_offset": 2048, 00:19:53.029 "data_size": 63488 00:19:53.029 }, 00:19:53.029 { 00:19:53.029 "name": "pt4", 00:19:53.029 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:53.029 "is_configured": true, 00:19:53.029 "data_offset": 2048, 00:19:53.029 "data_size": 63488 00:19:53.029 } 00:19:53.029 ] 00:19:53.029 }' 00:19:53.029 13:38:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.029 13:38:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:53.593 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:53.851 [2024-07-15 13:38:33.102683] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:53.851 "name": "raid_bdev1", 00:19:53.851 "aliases": [ 00:19:53.851 "2df2724c-395f-46b0-9754-c83b7d1df818" 00:19:53.851 ], 00:19:53.851 "product_name": "Raid Volume", 00:19:53.851 "block_size": 512, 00:19:53.851 "num_blocks": 253952, 00:19:53.851 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:19:53.851 "assigned_rate_limits": { 00:19:53.851 "rw_ios_per_sec": 0, 00:19:53.851 "rw_mbytes_per_sec": 0, 00:19:53.851 "r_mbytes_per_sec": 0, 00:19:53.851 "w_mbytes_per_sec": 0 00:19:53.851 }, 00:19:53.851 "claimed": false, 00:19:53.851 "zoned": false, 00:19:53.851 "supported_io_types": { 00:19:53.851 "read": true, 00:19:53.851 "write": true, 00:19:53.851 
"unmap": true, 00:19:53.851 "flush": true, 00:19:53.851 "reset": true, 00:19:53.851 "nvme_admin": false, 00:19:53.851 "nvme_io": false, 00:19:53.851 "nvme_io_md": false, 00:19:53.851 "write_zeroes": true, 00:19:53.851 "zcopy": false, 00:19:53.851 "get_zone_info": false, 00:19:53.851 "zone_management": false, 00:19:53.851 "zone_append": false, 00:19:53.851 "compare": false, 00:19:53.851 "compare_and_write": false, 00:19:53.851 "abort": false, 00:19:53.851 "seek_hole": false, 00:19:53.851 "seek_data": false, 00:19:53.851 "copy": false, 00:19:53.851 "nvme_iov_md": false 00:19:53.851 }, 00:19:53.851 "memory_domains": [ 00:19:53.851 { 00:19:53.851 "dma_device_id": "system", 00:19:53.851 "dma_device_type": 1 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.851 "dma_device_type": 2 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "system", 00:19:53.851 "dma_device_type": 1 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.851 "dma_device_type": 2 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "system", 00:19:53.851 "dma_device_type": 1 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.851 "dma_device_type": 2 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "system", 00:19:53.851 "dma_device_type": 1 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.851 "dma_device_type": 2 00:19:53.851 } 00:19:53.851 ], 00:19:53.851 "driver_specific": { 00:19:53.851 "raid": { 00:19:53.851 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:19:53.851 "strip_size_kb": 64, 00:19:53.851 "state": "online", 00:19:53.851 "raid_level": "concat", 00:19:53.851 "superblock": true, 00:19:53.851 "num_base_bdevs": 4, 00:19:53.851 "num_base_bdevs_discovered": 4, 00:19:53.851 "num_base_bdevs_operational": 4, 00:19:53.851 "base_bdevs_list": [ 00:19:53.851 { 00:19:53.851 "name": "pt1", 
00:19:53.851 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:53.851 "is_configured": true, 00:19:53.851 "data_offset": 2048, 00:19:53.851 "data_size": 63488 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "name": "pt2", 00:19:53.851 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:53.851 "is_configured": true, 00:19:53.851 "data_offset": 2048, 00:19:53.851 "data_size": 63488 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "name": "pt3", 00:19:53.851 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.851 "is_configured": true, 00:19:53.851 "data_offset": 2048, 00:19:53.851 "data_size": 63488 00:19:53.851 }, 00:19:53.851 { 00:19:53.851 "name": "pt4", 00:19:53.851 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:53.851 "is_configured": true, 00:19:53.851 "data_offset": 2048, 00:19:53.851 "data_size": 63488 00:19:53.851 } 00:19:53.851 ] 00:19:53.851 } 00:19:53.851 } 00:19:53.851 }' 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:53.851 pt2 00:19:53.851 pt3 00:19:53.851 pt4' 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:53.851 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.109 "name": "pt1", 00:19:54.109 "aliases": [ 00:19:54.109 "00000000-0000-0000-0000-000000000001" 00:19:54.109 ], 00:19:54.109 "product_name": "passthru", 00:19:54.109 "block_size": 512, 00:19:54.109 "num_blocks": 65536, 00:19:54.109 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:54.109 "assigned_rate_limits": { 00:19:54.109 "rw_ios_per_sec": 0, 00:19:54.109 "rw_mbytes_per_sec": 0, 00:19:54.109 "r_mbytes_per_sec": 0, 00:19:54.109 "w_mbytes_per_sec": 0 00:19:54.109 }, 00:19:54.109 "claimed": true, 00:19:54.109 "claim_type": "exclusive_write", 00:19:54.109 "zoned": false, 00:19:54.109 "supported_io_types": { 00:19:54.109 "read": true, 00:19:54.109 "write": true, 00:19:54.109 "unmap": true, 00:19:54.109 "flush": true, 00:19:54.109 "reset": true, 00:19:54.109 "nvme_admin": false, 00:19:54.109 "nvme_io": false, 00:19:54.109 "nvme_io_md": false, 00:19:54.109 "write_zeroes": true, 00:19:54.109 "zcopy": true, 00:19:54.109 "get_zone_info": false, 00:19:54.109 "zone_management": false, 00:19:54.109 "zone_append": false, 00:19:54.109 "compare": false, 00:19:54.109 "compare_and_write": false, 00:19:54.109 "abort": true, 00:19:54.109 "seek_hole": false, 00:19:54.109 "seek_data": false, 00:19:54.109 "copy": true, 00:19:54.109 "nvme_iov_md": false 00:19:54.109 }, 00:19:54.109 "memory_domains": [ 00:19:54.109 { 00:19:54.109 "dma_device_id": "system", 00:19:54.109 "dma_device_type": 1 00:19:54.109 }, 00:19:54.109 { 00:19:54.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.109 "dma_device_type": 2 00:19:54.109 } 00:19:54.109 ], 00:19:54.109 "driver_specific": { 00:19:54.109 "passthru": { 00:19:54.109 "name": "pt1", 00:19:54.109 "base_bdev_name": "malloc1" 00:19:54.109 } 00:19:54.109 } 00:19:54.109 }' 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.109 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.366 13:38:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:54.366 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.622 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.622 "name": "pt2", 00:19:54.622 "aliases": [ 00:19:54.622 "00000000-0000-0000-0000-000000000002" 00:19:54.622 ], 00:19:54.622 "product_name": "passthru", 00:19:54.622 "block_size": 512, 00:19:54.622 "num_blocks": 65536, 00:19:54.622 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:54.622 "assigned_rate_limits": { 00:19:54.622 "rw_ios_per_sec": 0, 00:19:54.622 "rw_mbytes_per_sec": 0, 00:19:54.622 "r_mbytes_per_sec": 0, 00:19:54.622 "w_mbytes_per_sec": 0 00:19:54.622 }, 00:19:54.622 "claimed": true, 00:19:54.622 "claim_type": "exclusive_write", 00:19:54.622 "zoned": false, 00:19:54.622 "supported_io_types": { 00:19:54.622 "read": true, 00:19:54.623 "write": true, 00:19:54.623 "unmap": true, 00:19:54.623 "flush": true, 00:19:54.623 "reset": true, 00:19:54.623 "nvme_admin": false, 00:19:54.623 
"nvme_io": false, 00:19:54.623 "nvme_io_md": false, 00:19:54.623 "write_zeroes": true, 00:19:54.623 "zcopy": true, 00:19:54.623 "get_zone_info": false, 00:19:54.623 "zone_management": false, 00:19:54.623 "zone_append": false, 00:19:54.623 "compare": false, 00:19:54.623 "compare_and_write": false, 00:19:54.623 "abort": true, 00:19:54.623 "seek_hole": false, 00:19:54.623 "seek_data": false, 00:19:54.623 "copy": true, 00:19:54.623 "nvme_iov_md": false 00:19:54.623 }, 00:19:54.623 "memory_domains": [ 00:19:54.623 { 00:19:54.623 "dma_device_id": "system", 00:19:54.623 "dma_device_type": 1 00:19:54.623 }, 00:19:54.623 { 00:19:54.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.623 "dma_device_type": 2 00:19:54.623 } 00:19:54.623 ], 00:19:54.623 "driver_specific": { 00:19:54.623 "passthru": { 00:19:54.623 "name": "pt2", 00:19:54.623 "base_bdev_name": "malloc2" 00:19:54.623 } 00:19:54.623 } 00:19:54.623 }' 00:19:54.623 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.879 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.137 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:55.137 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.137 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.137 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:55.137 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.394 "name": "pt3", 00:19:55.394 "aliases": [ 00:19:55.394 "00000000-0000-0000-0000-000000000003" 00:19:55.394 ], 00:19:55.394 "product_name": "passthru", 00:19:55.394 "block_size": 512, 00:19:55.394 "num_blocks": 65536, 00:19:55.394 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:55.394 "assigned_rate_limits": { 00:19:55.394 "rw_ios_per_sec": 0, 00:19:55.394 "rw_mbytes_per_sec": 0, 00:19:55.394 "r_mbytes_per_sec": 0, 00:19:55.394 "w_mbytes_per_sec": 0 00:19:55.394 }, 00:19:55.394 "claimed": true, 00:19:55.394 "claim_type": "exclusive_write", 00:19:55.394 "zoned": false, 00:19:55.394 "supported_io_types": { 00:19:55.394 "read": true, 00:19:55.394 "write": true, 00:19:55.394 "unmap": true, 00:19:55.394 "flush": true, 00:19:55.394 "reset": true, 00:19:55.394 "nvme_admin": false, 00:19:55.394 "nvme_io": false, 00:19:55.394 "nvme_io_md": false, 00:19:55.394 "write_zeroes": true, 00:19:55.394 "zcopy": true, 00:19:55.394 "get_zone_info": false, 00:19:55.394 "zone_management": false, 00:19:55.394 "zone_append": false, 00:19:55.394 "compare": false, 00:19:55.394 "compare_and_write": false, 00:19:55.394 "abort": true, 00:19:55.394 "seek_hole": false, 00:19:55.394 "seek_data": false, 00:19:55.394 "copy": true, 00:19:55.394 "nvme_iov_md": false 00:19:55.394 }, 00:19:55.394 "memory_domains": [ 00:19:55.394 { 00:19:55.394 "dma_device_id": "system", 00:19:55.394 
"dma_device_type": 1 00:19:55.394 }, 00:19:55.394 { 00:19:55.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.394 "dma_device_type": 2 00:19:55.394 } 00:19:55.394 ], 00:19:55.394 "driver_specific": { 00:19:55.394 "passthru": { 00:19:55.394 "name": "pt3", 00:19:55.394 "base_bdev_name": "malloc3" 00:19:55.394 } 00:19:55.394 } 00:19:55.394 }' 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.394 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.651 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.651 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:55.651 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.906 13:38:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.906 "name": "pt4", 00:19:55.906 "aliases": [ 00:19:55.906 "00000000-0000-0000-0000-000000000004" 00:19:55.906 ], 00:19:55.906 "product_name": "passthru", 00:19:55.906 "block_size": 512, 00:19:55.906 "num_blocks": 65536, 00:19:55.906 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:55.906 "assigned_rate_limits": { 00:19:55.906 "rw_ios_per_sec": 0, 00:19:55.906 "rw_mbytes_per_sec": 0, 00:19:55.906 "r_mbytes_per_sec": 0, 00:19:55.906 "w_mbytes_per_sec": 0 00:19:55.906 }, 00:19:55.906 "claimed": true, 00:19:55.906 "claim_type": "exclusive_write", 00:19:55.906 "zoned": false, 00:19:55.906 "supported_io_types": { 00:19:55.906 "read": true, 00:19:55.906 "write": true, 00:19:55.906 "unmap": true, 00:19:55.906 "flush": true, 00:19:55.906 "reset": true, 00:19:55.906 "nvme_admin": false, 00:19:55.906 "nvme_io": false, 00:19:55.906 "nvme_io_md": false, 00:19:55.906 "write_zeroes": true, 00:19:55.906 "zcopy": true, 00:19:55.906 "get_zone_info": false, 00:19:55.906 "zone_management": false, 00:19:55.906 "zone_append": false, 00:19:55.906 "compare": false, 00:19:55.906 "compare_and_write": false, 00:19:55.906 "abort": true, 00:19:55.906 "seek_hole": false, 00:19:55.906 "seek_data": false, 00:19:55.906 "copy": true, 00:19:55.906 "nvme_iov_md": false 00:19:55.906 }, 00:19:55.906 "memory_domains": [ 00:19:55.906 { 00:19:55.906 "dma_device_id": "system", 00:19:55.906 "dma_device_type": 1 00:19:55.906 }, 00:19:55.906 { 00:19:55.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.906 "dma_device_type": 2 00:19:55.906 } 00:19:55.906 ], 00:19:55.906 "driver_specific": { 00:19:55.906 "passthru": { 00:19:55.906 "name": "pt4", 00:19:55.906 "base_bdev_name": "malloc4" 00:19:55.906 } 00:19:55.906 } 00:19:55.906 }' 00:19:55.906 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.906 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.161 13:38:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.161 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:56.417 [2024-07-15 13:38:35.817896] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2df2724c-395f-46b0-9754-c83b7d1df818 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2df2724c-395f-46b0-9754-c83b7d1df818 ']' 00:19:56.417 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:56.673 [2024-07-15 13:38:36.058222] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:56.673 
[2024-07-15 13:38:36.058244] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:56.673 [2024-07-15 13:38:36.058293] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:56.673 [2024-07-15 13:38:36.058360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:56.673 [2024-07-15 13:38:36.058371] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a77530 name raid_bdev1, state offline 00:19:56.673 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.673 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:56.929 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:56.929 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:56.929 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:56.929 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:57.185 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:57.185 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:57.442 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:57.442 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:57.700 13:38:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:57.700 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:57.956 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:57.956 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:58.213 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:58.776 [2024-07-15 13:38:38.027339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:58.776 [2024-07-15 13:38:38.028686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:58.776 [2024-07-15 13:38:38.028728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:58.776 [2024-07-15 13:38:38.028762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:58.776 [2024-07-15 13:38:38.028807] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:58.776 [2024-07-15 13:38:38.028851] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:58.776 [2024-07-15 13:38:38.028873] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:58.776 [2024-07-15 13:38:38.028903] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:58.776 
[2024-07-15 13:38:38.028921] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:58.777 [2024-07-15 13:38:38.028938] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c22ff0 name raid_bdev1, state configuring 00:19:58.777 request: 00:19:58.777 { 00:19:58.777 "name": "raid_bdev1", 00:19:58.777 "raid_level": "concat", 00:19:58.777 "base_bdevs": [ 00:19:58.777 "malloc1", 00:19:58.777 "malloc2", 00:19:58.777 "malloc3", 00:19:58.777 "malloc4" 00:19:58.777 ], 00:19:58.777 "strip_size_kb": 64, 00:19:58.777 "superblock": false, 00:19:58.777 "method": "bdev_raid_create", 00:19:58.777 "req_id": 1 00:19:58.777 } 00:19:58.777 Got JSON-RPC error response 00:19:58.777 response: 00:19:58.777 { 00:19:58.777 "code": -17, 00:19:58.777 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:58.777 } 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.777 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:59.033 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:59.033 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:59.033 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:19:59.289 [2024-07-15 13:38:38.520577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:59.289 [2024-07-15 13:38:38.520622] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:59.289 [2024-07-15 13:38:38.520644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a7f7a0 00:19:59.289 [2024-07-15 13:38:38.520656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.289 [2024-07-15 13:38:38.522291] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.289 [2024-07-15 13:38:38.522318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:59.289 [2024-07-15 13:38:38.522382] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:59.289 [2024-07-15 13:38:38.522409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:59.289 pt1 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.289 13:38:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.853 13:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.853 "name": "raid_bdev1", 00:19:59.853 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:19:59.853 "strip_size_kb": 64, 00:19:59.853 "state": "configuring", 00:19:59.853 "raid_level": "concat", 00:19:59.853 "superblock": true, 00:19:59.853 "num_base_bdevs": 4, 00:19:59.853 "num_base_bdevs_discovered": 1, 00:19:59.853 "num_base_bdevs_operational": 4, 00:19:59.853 "base_bdevs_list": [ 00:19:59.853 { 00:19:59.853 "name": "pt1", 00:19:59.853 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:59.853 "is_configured": true, 00:19:59.853 "data_offset": 2048, 00:19:59.853 "data_size": 63488 00:19:59.853 }, 00:19:59.853 { 00:19:59.853 "name": null, 00:19:59.853 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:59.853 "is_configured": false, 00:19:59.853 "data_offset": 2048, 00:19:59.853 "data_size": 63488 00:19:59.853 }, 00:19:59.853 { 00:19:59.853 "name": null, 00:19:59.853 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:59.853 "is_configured": false, 00:19:59.853 "data_offset": 2048, 00:19:59.853 "data_size": 63488 00:19:59.853 }, 00:19:59.853 { 00:19:59.853 "name": null, 00:19:59.853 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:59.853 "is_configured": false, 00:19:59.853 "data_offset": 2048, 00:19:59.853 "data_size": 63488 00:19:59.853 } 00:19:59.853 ] 00:19:59.853 }' 00:19:59.853 13:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.853 13:38:39 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.416 13:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:00.416 13:38:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:00.979 [2024-07-15 13:38:40.112919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:00.979 [2024-07-15 13:38:40.112983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.980 [2024-07-15 13:38:40.113002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a76ea0 00:20:00.980 [2024-07-15 13:38:40.113015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.980 [2024-07-15 13:38:40.113367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.980 [2024-07-15 13:38:40.113387] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:00.980 [2024-07-15 13:38:40.113451] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:00.980 [2024-07-15 13:38:40.113471] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:00.980 pt2 00:20:00.980 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:01.236 [2024-07-15 13:38:40.449831] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.236 13:38:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.236 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:01.493 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.493 "name": "raid_bdev1", 00:20:01.493 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:20:01.493 "strip_size_kb": 64, 00:20:01.493 "state": "configuring", 00:20:01.493 "raid_level": "concat", 00:20:01.493 "superblock": true, 00:20:01.493 "num_base_bdevs": 4, 00:20:01.493 "num_base_bdevs_discovered": 1, 00:20:01.493 "num_base_bdevs_operational": 4, 00:20:01.493 "base_bdevs_list": [ 00:20:01.493 { 00:20:01.493 "name": "pt1", 00:20:01.493 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:01.493 "is_configured": true, 00:20:01.493 "data_offset": 2048, 00:20:01.493 "data_size": 63488 00:20:01.493 }, 00:20:01.493 { 00:20:01.493 "name": null, 00:20:01.493 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:01.493 
"is_configured": false, 00:20:01.493 "data_offset": 2048, 00:20:01.493 "data_size": 63488 00:20:01.493 }, 00:20:01.493 { 00:20:01.493 "name": null, 00:20:01.493 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:01.493 "is_configured": false, 00:20:01.493 "data_offset": 2048, 00:20:01.493 "data_size": 63488 00:20:01.493 }, 00:20:01.493 { 00:20:01.493 "name": null, 00:20:01.493 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:01.493 "is_configured": false, 00:20:01.493 "data_offset": 2048, 00:20:01.493 "data_size": 63488 00:20:01.493 } 00:20:01.493 ] 00:20:01.493 }' 00:20:01.493 13:38:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.493 13:38:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.056 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:02.056 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:02.056 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:02.312 [2024-07-15 13:38:41.496567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:02.312 [2024-07-15 13:38:41.496616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.312 [2024-07-15 13:38:41.496634] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a75ec0 00:20:02.312 [2024-07-15 13:38:41.496647] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.312 [2024-07-15 13:38:41.496999] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.312 [2024-07-15 13:38:41.497018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:02.312 [2024-07-15 13:38:41.497079] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:02.312 [2024-07-15 13:38:41.497098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:02.312 pt2 00:20:02.312 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:02.312 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:02.312 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:02.312 [2024-07-15 13:38:41.737216] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:02.312 [2024-07-15 13:38:41.737251] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.312 [2024-07-15 13:38:41.737267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a760f0 00:20:02.312 [2024-07-15 13:38:41.737280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.312 [2024-07-15 13:38:41.737577] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.312 [2024-07-15 13:38:41.737595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:02.312 [2024-07-15 13:38:41.737648] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:02.312 [2024-07-15 13:38:41.737665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:02.569 pt3 00:20:02.569 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:02.569 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:02.569 13:38:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:02.826 [2024-07-15 13:38:42.238550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:02.826 [2024-07-15 13:38:42.238586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.826 [2024-07-15 13:38:42.238602] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a7eaf0 00:20:02.826 [2024-07-15 13:38:42.238614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.826 [2024-07-15 13:38:42.238921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.826 [2024-07-15 13:38:42.238945] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:02.826 [2024-07-15 13:38:42.239000] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:02.826 [2024-07-15 13:38:42.239018] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:02.826 [2024-07-15 13:38:42.239140] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a788f0 00:20:02.826 [2024-07-15 13:38:42.239150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:02.826 [2024-07-15 13:38:42.239321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a78150 00:20:02.826 [2024-07-15 13:38:42.239449] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a788f0 00:20:02.826 [2024-07-15 13:38:42.239458] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a788f0 00:20:02.826 [2024-07-15 13:38:42.239557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.826 pt4 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.083 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.340 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.340 "name": "raid_bdev1", 00:20:03.340 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:20:03.340 "strip_size_kb": 64, 00:20:03.340 "state": "online", 00:20:03.340 "raid_level": "concat", 00:20:03.340 "superblock": true, 00:20:03.340 "num_base_bdevs": 4, 00:20:03.340 "num_base_bdevs_discovered": 4, 00:20:03.340 "num_base_bdevs_operational": 4, 
00:20:03.340 "base_bdevs_list": [ 00:20:03.340 { 00:20:03.340 "name": "pt1", 00:20:03.340 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:03.340 "is_configured": true, 00:20:03.340 "data_offset": 2048, 00:20:03.340 "data_size": 63488 00:20:03.340 }, 00:20:03.340 { 00:20:03.340 "name": "pt2", 00:20:03.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:03.340 "is_configured": true, 00:20:03.340 "data_offset": 2048, 00:20:03.340 "data_size": 63488 00:20:03.340 }, 00:20:03.340 { 00:20:03.340 "name": "pt3", 00:20:03.340 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:03.340 "is_configured": true, 00:20:03.340 "data_offset": 2048, 00:20:03.340 "data_size": 63488 00:20:03.340 }, 00:20:03.340 { 00:20:03.340 "name": "pt4", 00:20:03.340 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:03.340 "is_configured": true, 00:20:03.340 "data_offset": 2048, 00:20:03.340 "data_size": 63488 00:20:03.340 } 00:20:03.340 ] 00:20:03.340 }' 00:20:03.340 13:38:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.340 13:38:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:20:03.903 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:04.158 [2024-07-15 13:38:43.349835] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:04.158 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:04.158 "name": "raid_bdev1", 00:20:04.158 "aliases": [ 00:20:04.158 "2df2724c-395f-46b0-9754-c83b7d1df818" 00:20:04.158 ], 00:20:04.158 "product_name": "Raid Volume", 00:20:04.158 "block_size": 512, 00:20:04.158 "num_blocks": 253952, 00:20:04.158 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:20:04.158 "assigned_rate_limits": { 00:20:04.158 "rw_ios_per_sec": 0, 00:20:04.158 "rw_mbytes_per_sec": 0, 00:20:04.158 "r_mbytes_per_sec": 0, 00:20:04.158 "w_mbytes_per_sec": 0 00:20:04.159 }, 00:20:04.159 "claimed": false, 00:20:04.159 "zoned": false, 00:20:04.159 "supported_io_types": { 00:20:04.159 "read": true, 00:20:04.159 "write": true, 00:20:04.159 "unmap": true, 00:20:04.159 "flush": true, 00:20:04.159 "reset": true, 00:20:04.159 "nvme_admin": false, 00:20:04.159 "nvme_io": false, 00:20:04.159 "nvme_io_md": false, 00:20:04.159 "write_zeroes": true, 00:20:04.159 "zcopy": false, 00:20:04.159 "get_zone_info": false, 00:20:04.159 "zone_management": false, 00:20:04.159 "zone_append": false, 00:20:04.159 "compare": false, 00:20:04.159 "compare_and_write": false, 00:20:04.159 "abort": false, 00:20:04.159 "seek_hole": false, 00:20:04.159 "seek_data": false, 00:20:04.159 "copy": false, 00:20:04.159 "nvme_iov_md": false 00:20:04.159 }, 00:20:04.159 "memory_domains": [ 00:20:04.159 { 00:20:04.159 "dma_device_id": "system", 00:20:04.159 "dma_device_type": 1 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.159 "dma_device_type": 2 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "system", 00:20:04.159 "dma_device_type": 1 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:20:04.159 "dma_device_type": 2 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "system", 00:20:04.159 "dma_device_type": 1 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.159 "dma_device_type": 2 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "system", 00:20:04.159 "dma_device_type": 1 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.159 "dma_device_type": 2 00:20:04.159 } 00:20:04.159 ], 00:20:04.159 "driver_specific": { 00:20:04.159 "raid": { 00:20:04.159 "uuid": "2df2724c-395f-46b0-9754-c83b7d1df818", 00:20:04.159 "strip_size_kb": 64, 00:20:04.159 "state": "online", 00:20:04.159 "raid_level": "concat", 00:20:04.159 "superblock": true, 00:20:04.159 "num_base_bdevs": 4, 00:20:04.159 "num_base_bdevs_discovered": 4, 00:20:04.159 "num_base_bdevs_operational": 4, 00:20:04.159 "base_bdevs_list": [ 00:20:04.159 { 00:20:04.159 "name": "pt1", 00:20:04.159 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:04.159 "is_configured": true, 00:20:04.159 "data_offset": 2048, 00:20:04.159 "data_size": 63488 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "name": "pt2", 00:20:04.159 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:04.159 "is_configured": true, 00:20:04.159 "data_offset": 2048, 00:20:04.159 "data_size": 63488 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "name": "pt3", 00:20:04.159 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:04.159 "is_configured": true, 00:20:04.159 "data_offset": 2048, 00:20:04.159 "data_size": 63488 00:20:04.159 }, 00:20:04.159 { 00:20:04.159 "name": "pt4", 00:20:04.159 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:04.159 "is_configured": true, 00:20:04.159 "data_offset": 2048, 00:20:04.159 "data_size": 63488 00:20:04.159 } 00:20:04.159 ] 00:20:04.159 } 00:20:04.159 } 00:20:04.159 }' 00:20:04.159 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:04.159 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:04.159 pt2 00:20:04.159 pt3 00:20:04.159 pt4' 00:20:04.159 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.159 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.159 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.415 "name": "pt1", 00:20:04.415 "aliases": [ 00:20:04.415 "00000000-0000-0000-0000-000000000001" 00:20:04.415 ], 00:20:04.415 "product_name": "passthru", 00:20:04.415 "block_size": 512, 00:20:04.415 "num_blocks": 65536, 00:20:04.415 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:04.415 "assigned_rate_limits": { 00:20:04.415 "rw_ios_per_sec": 0, 00:20:04.415 "rw_mbytes_per_sec": 0, 00:20:04.415 "r_mbytes_per_sec": 0, 00:20:04.415 "w_mbytes_per_sec": 0 00:20:04.415 }, 00:20:04.415 "claimed": true, 00:20:04.415 "claim_type": "exclusive_write", 00:20:04.415 "zoned": false, 00:20:04.415 "supported_io_types": { 00:20:04.415 "read": true, 00:20:04.415 "write": true, 00:20:04.415 "unmap": true, 00:20:04.415 "flush": true, 00:20:04.415 "reset": true, 00:20:04.415 "nvme_admin": false, 00:20:04.415 "nvme_io": false, 00:20:04.415 "nvme_io_md": false, 00:20:04.415 "write_zeroes": true, 00:20:04.415 "zcopy": true, 00:20:04.415 "get_zone_info": false, 00:20:04.415 "zone_management": false, 00:20:04.415 "zone_append": false, 00:20:04.415 "compare": false, 00:20:04.415 "compare_and_write": false, 00:20:04.415 "abort": true, 00:20:04.415 "seek_hole": false, 00:20:04.415 "seek_data": false, 00:20:04.415 "copy": true, 00:20:04.415 "nvme_iov_md": 
false 00:20:04.415 }, 00:20:04.415 "memory_domains": [ 00:20:04.415 { 00:20:04.415 "dma_device_id": "system", 00:20:04.415 "dma_device_type": 1 00:20:04.415 }, 00:20:04.415 { 00:20:04.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.415 "dma_device_type": 2 00:20:04.415 } 00:20:04.415 ], 00:20:04.415 "driver_specific": { 00:20:04.415 "passthru": { 00:20:04.415 "name": "pt1", 00:20:04.415 "base_bdev_name": "malloc1" 00:20:04.415 } 00:20:04.415 } 00:20:04.415 }' 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.415 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.671 13:38:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:04.671 13:38:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.926 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.926 "name": "pt2", 00:20:04.926 "aliases": [ 00:20:04.926 "00000000-0000-0000-0000-000000000002" 00:20:04.926 ], 00:20:04.926 "product_name": "passthru", 00:20:04.926 "block_size": 512, 00:20:04.926 "num_blocks": 65536, 00:20:04.926 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:04.926 "assigned_rate_limits": { 00:20:04.926 "rw_ios_per_sec": 0, 00:20:04.926 "rw_mbytes_per_sec": 0, 00:20:04.926 "r_mbytes_per_sec": 0, 00:20:04.926 "w_mbytes_per_sec": 0 00:20:04.926 }, 00:20:04.926 "claimed": true, 00:20:04.927 "claim_type": "exclusive_write", 00:20:04.927 "zoned": false, 00:20:04.927 "supported_io_types": { 00:20:04.927 "read": true, 00:20:04.927 "write": true, 00:20:04.927 "unmap": true, 00:20:04.927 "flush": true, 00:20:04.927 "reset": true, 00:20:04.927 "nvme_admin": false, 00:20:04.927 "nvme_io": false, 00:20:04.927 "nvme_io_md": false, 00:20:04.927 "write_zeroes": true, 00:20:04.927 "zcopy": true, 00:20:04.927 "get_zone_info": false, 00:20:04.927 "zone_management": false, 00:20:04.927 "zone_append": false, 00:20:04.927 "compare": false, 00:20:04.927 "compare_and_write": false, 00:20:04.927 "abort": true, 00:20:04.927 "seek_hole": false, 00:20:04.927 "seek_data": false, 00:20:04.927 "copy": true, 00:20:04.927 "nvme_iov_md": false 00:20:04.927 }, 00:20:04.927 "memory_domains": [ 00:20:04.927 { 00:20:04.927 "dma_device_id": "system", 00:20:04.927 "dma_device_type": 1 00:20:04.927 }, 00:20:04.927 { 00:20:04.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.927 "dma_device_type": 2 00:20:04.927 } 00:20:04.927 ], 00:20:04.927 "driver_specific": { 00:20:04.927 "passthru": { 00:20:04.927 "name": "pt2", 00:20:04.927 "base_bdev_name": "malloc2" 00:20:04.927 } 00:20:04.927 } 00:20:04.927 }' 00:20:04.927 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:20:04.927 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.927 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.927 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:05.183 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:05.440 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:05.440 "name": "pt3", 00:20:05.440 "aliases": [ 00:20:05.440 "00000000-0000-0000-0000-000000000003" 00:20:05.440 ], 00:20:05.440 "product_name": "passthru", 00:20:05.440 "block_size": 512, 00:20:05.440 "num_blocks": 65536, 00:20:05.440 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:05.440 "assigned_rate_limits": { 00:20:05.440 "rw_ios_per_sec": 0, 00:20:05.440 "rw_mbytes_per_sec": 0, 
00:20:05.440 "r_mbytes_per_sec": 0, 00:20:05.440 "w_mbytes_per_sec": 0 00:20:05.440 }, 00:20:05.440 "claimed": true, 00:20:05.440 "claim_type": "exclusive_write", 00:20:05.440 "zoned": false, 00:20:05.440 "supported_io_types": { 00:20:05.440 "read": true, 00:20:05.440 "write": true, 00:20:05.440 "unmap": true, 00:20:05.440 "flush": true, 00:20:05.440 "reset": true, 00:20:05.440 "nvme_admin": false, 00:20:05.440 "nvme_io": false, 00:20:05.440 "nvme_io_md": false, 00:20:05.440 "write_zeroes": true, 00:20:05.440 "zcopy": true, 00:20:05.440 "get_zone_info": false, 00:20:05.440 "zone_management": false, 00:20:05.440 "zone_append": false, 00:20:05.440 "compare": false, 00:20:05.440 "compare_and_write": false, 00:20:05.440 "abort": true, 00:20:05.440 "seek_hole": false, 00:20:05.440 "seek_data": false, 00:20:05.440 "copy": true, 00:20:05.440 "nvme_iov_md": false 00:20:05.440 }, 00:20:05.440 "memory_domains": [ 00:20:05.440 { 00:20:05.440 "dma_device_id": "system", 00:20:05.440 "dma_device_type": 1 00:20:05.440 }, 00:20:05.440 { 00:20:05.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.440 "dma_device_type": 2 00:20:05.440 } 00:20:05.440 ], 00:20:05.440 "driver_specific": { 00:20:05.440 "passthru": { 00:20:05.440 "name": "pt3", 00:20:05.440 "base_bdev_name": "malloc3" 00:20:05.440 } 00:20:05.440 } 00:20:05.440 }' 00:20:05.440 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.697 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.697 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:05.697 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.697 13:38:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.697 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.697 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:20:05.697 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.697 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.697 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.971 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.971 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.971 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:05.971 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:05.971 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:06.244 "name": "pt4", 00:20:06.244 "aliases": [ 00:20:06.244 "00000000-0000-0000-0000-000000000004" 00:20:06.244 ], 00:20:06.244 "product_name": "passthru", 00:20:06.244 "block_size": 512, 00:20:06.244 "num_blocks": 65536, 00:20:06.244 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:06.244 "assigned_rate_limits": { 00:20:06.244 "rw_ios_per_sec": 0, 00:20:06.244 "rw_mbytes_per_sec": 0, 00:20:06.244 "r_mbytes_per_sec": 0, 00:20:06.244 "w_mbytes_per_sec": 0 00:20:06.244 }, 00:20:06.244 "claimed": true, 00:20:06.244 "claim_type": "exclusive_write", 00:20:06.244 "zoned": false, 00:20:06.244 "supported_io_types": { 00:20:06.244 "read": true, 00:20:06.244 "write": true, 00:20:06.244 "unmap": true, 00:20:06.244 "flush": true, 00:20:06.244 "reset": true, 00:20:06.244 "nvme_admin": false, 00:20:06.244 "nvme_io": false, 00:20:06.244 "nvme_io_md": false, 00:20:06.244 "write_zeroes": true, 00:20:06.244 "zcopy": true, 00:20:06.244 "get_zone_info": false, 00:20:06.244 
"zone_management": false, 00:20:06.244 "zone_append": false, 00:20:06.244 "compare": false, 00:20:06.244 "compare_and_write": false, 00:20:06.244 "abort": true, 00:20:06.244 "seek_hole": false, 00:20:06.244 "seek_data": false, 00:20:06.244 "copy": true, 00:20:06.244 "nvme_iov_md": false 00:20:06.244 }, 00:20:06.244 "memory_domains": [ 00:20:06.244 { 00:20:06.244 "dma_device_id": "system", 00:20:06.244 "dma_device_type": 1 00:20:06.244 }, 00:20:06.244 { 00:20:06.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.244 "dma_device_type": 2 00:20:06.244 } 00:20:06.244 ], 00:20:06.244 "driver_specific": { 00:20:06.244 "passthru": { 00:20:06.244 "name": "pt4", 00:20:06.244 "base_bdev_name": "malloc4" 00:20:06.244 } 00:20:06.244 } 00:20:06.244 }' 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.244 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:06.517 13:38:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:06.775 [2024-07-15 13:38:46.008900] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2df2724c-395f-46b0-9754-c83b7d1df818 '!=' 2df2724c-395f-46b0-9754-c83b7d1df818 ']' 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2152962 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2152962 ']' 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2152962 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2152962 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2152962' 00:20:06.775 killing process with pid 2152962 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2152962 
00:20:06.775 [2024-07-15 13:38:46.076164] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:06.775 [2024-07-15 13:38:46.076232] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:06.775 [2024-07-15 13:38:46.076296] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:06.775 [2024-07-15 13:38:46.076308] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a788f0 name raid_bdev1, state offline 00:20:06.775 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2152962 00:20:06.775 [2024-07-15 13:38:46.117490] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:07.033 13:38:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:07.033 00:20:07.033 real 0m17.954s 00:20:07.033 user 0m32.588s 00:20:07.033 sys 0m3.045s 00:20:07.033 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:07.033 13:38:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.033 ************************************ 00:20:07.033 END TEST raid_superblock_test 00:20:07.033 ************************************ 00:20:07.033 13:38:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:07.033 13:38:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:07.033 13:38:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:07.033 13:38:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:07.033 13:38:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:07.033 ************************************ 00:20:07.033 START TEST raid_read_error_test 00:20:07.033 ************************************ 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:20:07.033 
13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:07.033 13:38:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aW15CTtzQO 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2155576 00:20:07.033 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2155576 /var/tmp/spdk-raid.sock 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2155576 ']' 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:07.034 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:07.034 13:38:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.291 [2024-07-15 13:38:46.498969] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:20:07.291 [2024-07-15 13:38:46.499036] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2155576 ] 00:20:07.291 [2024-07-15 13:38:46.629750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:07.549 [2024-07-15 13:38:46.737446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:07.549 [2024-07-15 13:38:46.804083] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:07.549 [2024-07-15 13:38:46.804110] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:08.113 13:38:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:08.113 13:38:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:08.113 13:38:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:08.113 13:38:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:08.370 BaseBdev1_malloc 00:20:08.370 13:38:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:20:08.627 true 00:20:08.627 13:38:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:09.192 [2024-07-15 13:38:48.427937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:09.192 [2024-07-15 13:38:48.427986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.192 [2024-07-15 13:38:48.428007] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240d0d0 00:20:09.192 [2024-07-15 13:38:48.428027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.192 [2024-07-15 13:38:48.429923] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.192 [2024-07-15 13:38:48.429961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:09.192 BaseBdev1 00:20:09.192 13:38:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:09.192 13:38:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:09.449 BaseBdev2_malloc 00:20:09.449 13:38:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:10.013 true 00:20:10.013 13:38:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:10.271 [2024-07-15 13:38:49.448574] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:10.271 [2024-07-15 13:38:49.448621] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:10.271 [2024-07-15 13:38:49.448643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2411910 00:20:10.271 [2024-07-15 13:38:49.448657] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.271 [2024-07-15 13:38:49.450233] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.271 [2024-07-15 13:38:49.450263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:10.271 BaseBdev2 00:20:10.271 13:38:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:10.271 13:38:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:10.271 BaseBdev3_malloc 00:20:10.529 13:38:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:10.787 true 00:20:11.045 13:38:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:11.045 [2024-07-15 13:38:50.457103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:11.045 [2024-07-15 13:38:50.457152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:11.045 [2024-07-15 13:38:50.457173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2413bd0 00:20:11.045 [2024-07-15 13:38:50.457186] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:11.045 [2024-07-15 13:38:50.458782] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:20:11.045 [2024-07-15 13:38:50.458810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:11.045 BaseBdev3 00:20:11.302 13:38:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:11.302 13:38:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:11.560 BaseBdev4_malloc 00:20:11.817 13:38:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:11.817 true 00:20:11.817 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:12.074 [2024-07-15 13:38:51.468353] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:12.074 [2024-07-15 13:38:51.468399] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:12.074 [2024-07-15 13:38:51.468429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2414aa0 00:20:12.074 [2024-07-15 13:38:51.468442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:12.075 [2024-07-15 13:38:51.469878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:12.075 [2024-07-15 13:38:51.469909] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:12.075 BaseBdev4 00:20:12.075 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:20:12.333 [2024-07-15 13:38:51.717057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:12.333 [2024-07-15 13:38:51.718473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:12.333 [2024-07-15 13:38:51.718543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:12.333 [2024-07-15 13:38:51.718604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:12.333 [2024-07-15 13:38:51.718841] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x240ec20 00:20:12.333 [2024-07-15 13:38:51.718853] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:12.333 [2024-07-15 13:38:51.719066] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2263260 00:20:12.333 [2024-07-15 13:38:51.719220] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x240ec20 00:20:12.333 [2024-07-15 13:38:51.719230] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x240ec20 00:20:12.333 [2024-07-15 13:38:51.719338] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.333 13:38:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.333 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.591 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.591 "name": "raid_bdev1", 00:20:12.591 "uuid": "21d4f958-39ef-43f2-96eb-938baa1dc490", 00:20:12.591 "strip_size_kb": 64, 00:20:12.591 "state": "online", 00:20:12.591 "raid_level": "concat", 00:20:12.591 "superblock": true, 00:20:12.591 "num_base_bdevs": 4, 00:20:12.591 "num_base_bdevs_discovered": 4, 00:20:12.591 "num_base_bdevs_operational": 4, 00:20:12.591 "base_bdevs_list": [ 00:20:12.591 { 00:20:12.591 "name": "BaseBdev1", 00:20:12.591 "uuid": "fe9e3e51-d774-55a4-88f8-8801fac0906d", 00:20:12.591 "is_configured": true, 00:20:12.591 "data_offset": 2048, 00:20:12.591 "data_size": 63488 00:20:12.591 }, 00:20:12.591 { 00:20:12.591 "name": "BaseBdev2", 00:20:12.591 "uuid": "858dedc9-59a3-5bc3-a56b-a02f148a54e1", 00:20:12.591 "is_configured": true, 00:20:12.591 "data_offset": 2048, 00:20:12.591 "data_size": 63488 00:20:12.591 }, 00:20:12.591 { 00:20:12.591 "name": "BaseBdev3", 00:20:12.591 "uuid": "4caa95b4-cedf-5e65-97a5-65361b50f685", 00:20:12.591 "is_configured": true, 00:20:12.591 "data_offset": 2048, 00:20:12.591 "data_size": 63488 00:20:12.591 }, 00:20:12.591 { 00:20:12.591 "name": "BaseBdev4", 00:20:12.591 "uuid": 
"2e53f300-a818-5fa3-ab30-1d5a1fce7042", 00:20:12.591 "is_configured": true, 00:20:12.591 "data_offset": 2048, 00:20:12.591 "data_size": 63488 00:20:12.591 } 00:20:12.591 ] 00:20:12.591 }' 00:20:12.591 13:38:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.591 13:38:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.524 13:38:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:13.524 13:38:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:13.524 [2024-07-15 13:38:52.691901] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2400fc0 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.456 13:38:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.713 13:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.713 "name": "raid_bdev1", 00:20:14.713 "uuid": "21d4f958-39ef-43f2-96eb-938baa1dc490", 00:20:14.713 "strip_size_kb": 64, 00:20:14.713 "state": "online", 00:20:14.713 "raid_level": "concat", 00:20:14.713 "superblock": true, 00:20:14.713 "num_base_bdevs": 4, 00:20:14.713 "num_base_bdevs_discovered": 4, 00:20:14.713 "num_base_bdevs_operational": 4, 00:20:14.713 "base_bdevs_list": [ 00:20:14.713 { 00:20:14.713 "name": "BaseBdev1", 00:20:14.713 "uuid": "fe9e3e51-d774-55a4-88f8-8801fac0906d", 00:20:14.713 "is_configured": true, 00:20:14.713 "data_offset": 2048, 00:20:14.713 "data_size": 63488 00:20:14.713 }, 00:20:14.713 { 00:20:14.713 "name": "BaseBdev2", 00:20:14.713 "uuid": "858dedc9-59a3-5bc3-a56b-a02f148a54e1", 00:20:14.713 "is_configured": true, 00:20:14.713 "data_offset": 2048, 00:20:14.713 "data_size": 63488 00:20:14.713 }, 00:20:14.713 { 00:20:14.713 "name": "BaseBdev3", 00:20:14.713 "uuid": "4caa95b4-cedf-5e65-97a5-65361b50f685", 00:20:14.713 "is_configured": true, 00:20:14.713 "data_offset": 2048, 00:20:14.713 "data_size": 63488 00:20:14.713 }, 00:20:14.713 { 
00:20:14.713 "name": "BaseBdev4", 00:20:14.713 "uuid": "2e53f300-a818-5fa3-ab30-1d5a1fce7042", 00:20:14.713 "is_configured": true, 00:20:14.713 "data_offset": 2048, 00:20:14.713 "data_size": 63488 00:20:14.713 } 00:20:14.713 ] 00:20:14.713 }' 00:20:14.713 13:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.713 13:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.277 13:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:15.533 [2024-07-15 13:38:54.932991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:15.533 [2024-07-15 13:38:54.933034] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:15.533 [2024-07-15 13:38:54.936393] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:15.533 [2024-07-15 13:38:54.936432] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:15.533 [2024-07-15 13:38:54.936473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:15.533 [2024-07-15 13:38:54.936484] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x240ec20 name raid_bdev1, state offline 00:20:15.533 0 00:20:15.533 13:38:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2155576 00:20:15.533 13:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2155576 ']' 00:20:15.533 13:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2155576 00:20:15.533 13:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:15.789 13:38:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:15.789 13:38:54 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2155576 00:20:15.789 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:15.789 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:15.789 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2155576' 00:20:15.789 killing process with pid 2155576 00:20:15.789 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2155576 00:20:15.789 [2024-07-15 13:38:55.005494] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:15.789 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2155576 00:20:15.789 [2024-07-15 13:38:55.036911] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.aW15CTtzQO 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:16.046 00:20:16.046 real 0m8.857s 00:20:16.046 user 0m14.503s 00:20:16.046 sys 0m1.463s 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:16.046 13:38:55 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:16.046 ************************************ 00:20:16.046 END TEST raid_read_error_test 00:20:16.046 ************************************ 00:20:16.046 13:38:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:16.046 13:38:55 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:16.046 13:38:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:16.046 13:38:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:16.046 13:38:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:16.046 ************************************ 00:20:16.046 START TEST raid_write_error_test 00:20:16.046 ************************************ 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7pgGZqsCB6 00:20:16.046 13:38:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2156891 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2156891 /var/tmp/spdk-raid.sock 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2156891 ']' 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:16.046 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:16.046 13:38:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.046 [2024-07-15 13:38:55.438801] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:20:16.046 [2024-07-15 13:38:55.438863] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2156891 ] 00:20:16.302 [2024-07-15 13:38:55.567906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:16.302 [2024-07-15 13:38:55.665965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:16.302 [2024-07-15 13:38:55.725408] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:16.302 [2024-07-15 13:38:55.725434] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:17.234 13:38:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:17.234 13:38:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:17.234 13:38:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:17.234 13:38:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:17.234 BaseBdev1_malloc 00:20:17.234 13:38:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:17.493 true 00:20:17.493 13:38:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:17.750 [2024-07-15 13:38:57.006517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:17.750 [2024-07-15 13:38:57.006560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:20:17.750 [2024-07-15 13:38:57.006579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22190d0 00:20:17.750 [2024-07-15 13:38:57.006591] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.750 [2024-07-15 13:38:57.008295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.750 [2024-07-15 13:38:57.008323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:17.750 BaseBdev1 00:20:17.750 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:17.750 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:18.008 BaseBdev2_malloc 00:20:18.008 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:18.265 true 00:20:18.265 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:18.523 [2024-07-15 13:38:57.736942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:18.524 [2024-07-15 13:38:57.736984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:18.524 [2024-07-15 13:38:57.737002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221d910 00:20:18.524 [2024-07-15 13:38:57.737015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:18.524 [2024-07-15 13:38:57.738530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:18.524 [2024-07-15 13:38:57.738557] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:18.524 BaseBdev2 00:20:18.524 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:18.524 13:38:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:18.781 BaseBdev3_malloc 00:20:18.781 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:19.040 true 00:20:19.040 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:19.297 [2024-07-15 13:38:58.479403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:19.297 [2024-07-15 13:38:58.479444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.297 [2024-07-15 13:38:58.479463] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221fbd0 00:20:19.297 [2024-07-15 13:38:58.479481] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.297 [2024-07-15 13:38:58.481047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.297 [2024-07-15 13:38:58.481073] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:19.297 BaseBdev3 00:20:19.297 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:19.297 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:19.555 BaseBdev4_malloc 00:20:19.555 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:19.815 true 00:20:19.815 13:38:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:19.815 [2024-07-15 13:38:59.225878] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:19.815 [2024-07-15 13:38:59.225920] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.815 [2024-07-15 13:38:59.225946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2220aa0 00:20:19.815 [2024-07-15 13:38:59.225958] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.815 [2024-07-15 13:38:59.227381] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.815 [2024-07-15 13:38:59.227409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:19.815 BaseBdev4 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:20.135 [2024-07-15 13:38:59.474576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:20.135 [2024-07-15 13:38:59.475847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:20.135 [2024-07-15 13:38:59.475914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:20.135 [2024-07-15 13:38:59.475984] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:20.135 [2024-07-15 13:38:59.476210] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x221ac20 00:20:20.135 [2024-07-15 13:38:59.476222] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:20.135 [2024-07-15 13:38:59.476406] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x206f260 00:20:20.135 [2024-07-15 13:38:59.476548] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x221ac20 00:20:20.135 [2024-07-15 13:38:59.476558] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x221ac20 00:20:20.135 [2024-07-15 13:38:59.476656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.135 13:38:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.135 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.393 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.393 "name": "raid_bdev1", 00:20:20.393 "uuid": "9ef3e77f-7bf0-4a04-8ca5-71e48f8fd516", 00:20:20.393 "strip_size_kb": 64, 00:20:20.393 "state": "online", 00:20:20.393 "raid_level": "concat", 00:20:20.393 "superblock": true, 00:20:20.393 "num_base_bdevs": 4, 00:20:20.393 "num_base_bdevs_discovered": 4, 00:20:20.393 "num_base_bdevs_operational": 4, 00:20:20.393 "base_bdevs_list": [ 00:20:20.393 { 00:20:20.393 "name": "BaseBdev1", 00:20:20.393 "uuid": "52d9cc5a-f04b-5bf6-9b37-d1f4bc32eaa9", 00:20:20.393 "is_configured": true, 00:20:20.393 "data_offset": 2048, 00:20:20.393 "data_size": 63488 00:20:20.393 }, 00:20:20.393 { 00:20:20.393 "name": "BaseBdev2", 00:20:20.393 "uuid": "5bbeb772-e87b-569f-856a-02b392d4d745", 00:20:20.393 "is_configured": true, 00:20:20.393 "data_offset": 2048, 00:20:20.393 "data_size": 63488 00:20:20.393 }, 00:20:20.393 { 00:20:20.393 "name": "BaseBdev3", 00:20:20.393 "uuid": "ff8d7a9b-ad4a-51f3-a362-2d38a98714f3", 00:20:20.393 "is_configured": true, 00:20:20.393 "data_offset": 2048, 00:20:20.393 "data_size": 63488 00:20:20.393 }, 00:20:20.393 { 00:20:20.393 "name": "BaseBdev4", 00:20:20.393 "uuid": "eff3a56b-9703-56b7-b567-ffc4c4f7d14f", 00:20:20.393 "is_configured": true, 00:20:20.393 "data_offset": 2048, 00:20:20.393 "data_size": 63488 00:20:20.393 } 00:20:20.393 ] 00:20:20.393 }' 00:20:20.393 13:38:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.393 13:38:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.074 13:39:00 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:20:21.074 13:39:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:21.074 [2024-07-15 13:39:00.445439] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x220cfc0 00:20:22.004 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.261 13:39:01 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.261 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.517 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.517 "name": "raid_bdev1", 00:20:22.517 "uuid": "9ef3e77f-7bf0-4a04-8ca5-71e48f8fd516", 00:20:22.517 "strip_size_kb": 64, 00:20:22.517 "state": "online", 00:20:22.517 "raid_level": "concat", 00:20:22.517 "superblock": true, 00:20:22.517 "num_base_bdevs": 4, 00:20:22.517 "num_base_bdevs_discovered": 4, 00:20:22.517 "num_base_bdevs_operational": 4, 00:20:22.517 "base_bdevs_list": [ 00:20:22.517 { 00:20:22.517 "name": "BaseBdev1", 00:20:22.517 "uuid": "52d9cc5a-f04b-5bf6-9b37-d1f4bc32eaa9", 00:20:22.517 "is_configured": true, 00:20:22.517 "data_offset": 2048, 00:20:22.517 "data_size": 63488 00:20:22.517 }, 00:20:22.517 { 00:20:22.517 "name": "BaseBdev2", 00:20:22.517 "uuid": "5bbeb772-e87b-569f-856a-02b392d4d745", 00:20:22.517 "is_configured": true, 00:20:22.517 "data_offset": 2048, 00:20:22.517 "data_size": 63488 00:20:22.517 }, 00:20:22.517 { 00:20:22.517 "name": "BaseBdev3", 00:20:22.517 "uuid": "ff8d7a9b-ad4a-51f3-a362-2d38a98714f3", 00:20:22.517 "is_configured": true, 00:20:22.517 "data_offset": 2048, 00:20:22.517 "data_size": 63488 00:20:22.517 }, 00:20:22.517 { 00:20:22.517 "name": "BaseBdev4", 00:20:22.517 "uuid": "eff3a56b-9703-56b7-b567-ffc4c4f7d14f", 00:20:22.517 "is_configured": true, 00:20:22.517 "data_offset": 2048, 00:20:22.517 "data_size": 63488 00:20:22.517 } 00:20:22.517 ] 00:20:22.517 }' 00:20:22.517 13:39:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.517 13:39:01 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:23.078 13:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:23.334 [2024-07-15 13:39:02.667439] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:23.334 [2024-07-15 13:39:02.667482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:23.334 [2024-07-15 13:39:02.670649] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:23.334 [2024-07-15 13:39:02.670686] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:23.334 [2024-07-15 13:39:02.670727] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:23.334 [2024-07-15 13:39:02.670739] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x221ac20 name raid_bdev1, state offline 00:20:23.334 0 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2156891 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2156891 ']' 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2156891 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2156891 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2156891' 00:20:23.334 killing process with pid 2156891 00:20:23.334 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2156891 00:20:23.334 [2024-07-15 13:39:02.749596] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:23.335 13:39:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2156891 00:20:23.592 [2024-07-15 13:39:02.782754] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:23.592 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7pgGZqsCB6 00:20:23.592 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:23.592 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:23.849 00:20:23.849 real 0m7.655s 00:20:23.849 user 0m12.234s 00:20:23.849 sys 0m1.350s 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:23.849 13:39:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.849 ************************************ 00:20:23.849 END TEST raid_write_error_test 00:20:23.849 ************************************ 00:20:23.849 13:39:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:23.849 13:39:03 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:23.849 
13:39:03 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:23.849 13:39:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:23.849 13:39:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:23.849 13:39:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:23.849 ************************************ 00:20:23.849 START TEST raid_state_function_test 00:20:23.849 ************************************ 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:23.849 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # 
raid_pid=2158156 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2158156' 00:20:23.850 Process raid pid: 2158156 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2158156 /var/tmp/spdk-raid.sock 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2158156 ']' 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:23.850 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:23.850 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.850 [2024-07-15 13:39:03.174366] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:20:23.850 [2024-07-15 13:39:03.174436] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:24.107 [2024-07-15 13:39:03.302674] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:24.107 [2024-07-15 13:39:03.406466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:24.107 [2024-07-15 13:39:03.462912] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:24.107 [2024-07-15 13:39:03.462942] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:25.040 [2024-07-15 13:39:04.330553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:25.040 [2024-07-15 13:39:04.330596] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:25.040 [2024-07-15 13:39:04.330607] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:25.040 [2024-07-15 13:39:04.330619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:25.040 [2024-07-15 13:39:04.330628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:25.040 [2024-07-15 13:39:04.330639] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:25.040 
[2024-07-15 13:39:04.330648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:25.040 [2024-07-15 13:39:04.330659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.040 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.298 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.298 "name": "Existed_Raid", 00:20:25.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.298 "strip_size_kb": 0, 00:20:25.298 "state": 
"configuring", 00:20:25.298 "raid_level": "raid1", 00:20:25.298 "superblock": false, 00:20:25.298 "num_base_bdevs": 4, 00:20:25.298 "num_base_bdevs_discovered": 0, 00:20:25.298 "num_base_bdevs_operational": 4, 00:20:25.298 "base_bdevs_list": [ 00:20:25.298 { 00:20:25.298 "name": "BaseBdev1", 00:20:25.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.298 "is_configured": false, 00:20:25.298 "data_offset": 0, 00:20:25.298 "data_size": 0 00:20:25.298 }, 00:20:25.298 { 00:20:25.298 "name": "BaseBdev2", 00:20:25.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.298 "is_configured": false, 00:20:25.298 "data_offset": 0, 00:20:25.298 "data_size": 0 00:20:25.298 }, 00:20:25.298 { 00:20:25.298 "name": "BaseBdev3", 00:20:25.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.298 "is_configured": false, 00:20:25.298 "data_offset": 0, 00:20:25.298 "data_size": 0 00:20:25.298 }, 00:20:25.298 { 00:20:25.298 "name": "BaseBdev4", 00:20:25.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.298 "is_configured": false, 00:20:25.298 "data_offset": 0, 00:20:25.298 "data_size": 0 00:20:25.298 } 00:20:25.298 ] 00:20:25.298 }' 00:20:25.298 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.298 13:39:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.863 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:26.120 [2024-07-15 13:39:05.429331] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:26.120 [2024-07-15 13:39:05.429363] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcceaa0 name Existed_Raid, state configuring 00:20:26.120 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:26.378 [2024-07-15 13:39:05.669984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:26.378 [2024-07-15 13:39:05.670015] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:26.378 [2024-07-15 13:39:05.670025] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:26.378 [2024-07-15 13:39:05.670037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:26.378 [2024-07-15 13:39:05.670046] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:26.378 [2024-07-15 13:39:05.670057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:26.378 [2024-07-15 13:39:05.670066] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:26.378 [2024-07-15 13:39:05.670077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:26.378 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:26.636 [2024-07-15 13:39:05.924550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:26.636 BaseBdev1 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:26.636 13:39:05 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:26.636 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:26.894 13:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:27.152 [ 00:20:27.152 { 00:20:27.152 "name": "BaseBdev1", 00:20:27.152 "aliases": [ 00:20:27.152 "02ad51a4-09af-4451-bd9a-f77f1af67055" 00:20:27.152 ], 00:20:27.152 "product_name": "Malloc disk", 00:20:27.152 "block_size": 512, 00:20:27.152 "num_blocks": 65536, 00:20:27.152 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:27.152 "assigned_rate_limits": { 00:20:27.152 "rw_ios_per_sec": 0, 00:20:27.152 "rw_mbytes_per_sec": 0, 00:20:27.152 "r_mbytes_per_sec": 0, 00:20:27.152 "w_mbytes_per_sec": 0 00:20:27.152 }, 00:20:27.152 "claimed": true, 00:20:27.152 "claim_type": "exclusive_write", 00:20:27.152 "zoned": false, 00:20:27.152 "supported_io_types": { 00:20:27.152 "read": true, 00:20:27.152 "write": true, 00:20:27.152 "unmap": true, 00:20:27.152 "flush": true, 00:20:27.152 "reset": true, 00:20:27.152 "nvme_admin": false, 00:20:27.152 "nvme_io": false, 00:20:27.152 "nvme_io_md": false, 00:20:27.152 "write_zeroes": true, 00:20:27.152 "zcopy": true, 00:20:27.152 "get_zone_info": false, 00:20:27.152 "zone_management": false, 00:20:27.152 "zone_append": false, 00:20:27.152 "compare": false, 00:20:27.152 "compare_and_write": false, 00:20:27.152 "abort": true, 00:20:27.152 "seek_hole": false, 00:20:27.152 "seek_data": false, 00:20:27.152 "copy": true, 00:20:27.152 "nvme_iov_md": false 00:20:27.152 }, 00:20:27.152 "memory_domains": [ 00:20:27.152 { 
00:20:27.152 "dma_device_id": "system", 00:20:27.152 "dma_device_type": 1 00:20:27.152 }, 00:20:27.152 { 00:20:27.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:27.152 "dma_device_type": 2 00:20:27.152 } 00:20:27.152 ], 00:20:27.152 "driver_specific": {} 00:20:27.152 } 00:20:27.152 ] 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.152 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:27.410 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:20:27.410 "name": "Existed_Raid", 00:20:27.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.410 "strip_size_kb": 0, 00:20:27.410 "state": "configuring", 00:20:27.410 "raid_level": "raid1", 00:20:27.410 "superblock": false, 00:20:27.410 "num_base_bdevs": 4, 00:20:27.410 "num_base_bdevs_discovered": 1, 00:20:27.410 "num_base_bdevs_operational": 4, 00:20:27.410 "base_bdevs_list": [ 00:20:27.410 { 00:20:27.410 "name": "BaseBdev1", 00:20:27.410 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:27.410 "is_configured": true, 00:20:27.410 "data_offset": 0, 00:20:27.410 "data_size": 65536 00:20:27.410 }, 00:20:27.410 { 00:20:27.410 "name": "BaseBdev2", 00:20:27.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.410 "is_configured": false, 00:20:27.410 "data_offset": 0, 00:20:27.410 "data_size": 0 00:20:27.410 }, 00:20:27.410 { 00:20:27.410 "name": "BaseBdev3", 00:20:27.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.410 "is_configured": false, 00:20:27.410 "data_offset": 0, 00:20:27.410 "data_size": 0 00:20:27.410 }, 00:20:27.410 { 00:20:27.410 "name": "BaseBdev4", 00:20:27.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.410 "is_configured": false, 00:20:27.410 "data_offset": 0, 00:20:27.410 "data_size": 0 00:20:27.410 } 00:20:27.410 ] 00:20:27.410 }' 00:20:27.410 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.410 13:39:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.976 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:28.234 [2024-07-15 13:39:07.488699] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:28.234 [2024-07-15 13:39:07.488740] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcce310 name Existed_Raid, state configuring 
00:20:28.234 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:28.493 [2024-07-15 13:39:07.733378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:28.493 [2024-07-15 13:39:07.734816] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:28.493 [2024-07-15 13:39:07.734848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:28.493 [2024-07-15 13:39:07.734859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:28.493 [2024-07-15 13:39:07.734872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:28.493 [2024-07-15 13:39:07.734882] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:28.493 [2024-07-15 13:39:07.734893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.493 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.752 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.752 "name": "Existed_Raid", 00:20:28.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.752 "strip_size_kb": 0, 00:20:28.752 "state": "configuring", 00:20:28.752 "raid_level": "raid1", 00:20:28.752 "superblock": false, 00:20:28.752 "num_base_bdevs": 4, 00:20:28.752 "num_base_bdevs_discovered": 1, 00:20:28.752 "num_base_bdevs_operational": 4, 00:20:28.752 "base_bdevs_list": [ 00:20:28.752 { 00:20:28.752 "name": "BaseBdev1", 00:20:28.752 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:28.752 "is_configured": true, 00:20:28.752 "data_offset": 0, 00:20:28.752 "data_size": 65536 00:20:28.752 }, 00:20:28.752 { 00:20:28.752 "name": "BaseBdev2", 00:20:28.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.752 "is_configured": false, 00:20:28.752 "data_offset": 0, 00:20:28.752 "data_size": 0 00:20:28.752 }, 00:20:28.752 { 00:20:28.752 "name": "BaseBdev3", 00:20:28.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.752 "is_configured": false, 00:20:28.752 
"data_offset": 0, 00:20:28.752 "data_size": 0 00:20:28.752 }, 00:20:28.752 { 00:20:28.752 "name": "BaseBdev4", 00:20:28.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.752 "is_configured": false, 00:20:28.752 "data_offset": 0, 00:20:28.752 "data_size": 0 00:20:28.752 } 00:20:28.752 ] 00:20:28.752 }' 00:20:28.752 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.752 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.318 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:29.575 [2024-07-15 13:39:08.819764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:29.575 BaseBdev2 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:29.575 13:39:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.834 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:29.834 [ 
00:20:29.834 { 00:20:29.834 "name": "BaseBdev2", 00:20:29.834 "aliases": [ 00:20:29.834 "96bba3e4-4f76-47f3-ab57-9155e28e5a5b" 00:20:29.834 ], 00:20:29.834 "product_name": "Malloc disk", 00:20:29.834 "block_size": 512, 00:20:29.834 "num_blocks": 65536, 00:20:29.834 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:29.834 "assigned_rate_limits": { 00:20:29.834 "rw_ios_per_sec": 0, 00:20:29.834 "rw_mbytes_per_sec": 0, 00:20:29.834 "r_mbytes_per_sec": 0, 00:20:29.834 "w_mbytes_per_sec": 0 00:20:29.834 }, 00:20:29.834 "claimed": true, 00:20:29.834 "claim_type": "exclusive_write", 00:20:29.834 "zoned": false, 00:20:29.834 "supported_io_types": { 00:20:29.834 "read": true, 00:20:29.834 "write": true, 00:20:29.834 "unmap": true, 00:20:29.834 "flush": true, 00:20:29.834 "reset": true, 00:20:29.834 "nvme_admin": false, 00:20:29.834 "nvme_io": false, 00:20:29.834 "nvme_io_md": false, 00:20:29.834 "write_zeroes": true, 00:20:29.834 "zcopy": true, 00:20:29.834 "get_zone_info": false, 00:20:29.834 "zone_management": false, 00:20:29.834 "zone_append": false, 00:20:29.834 "compare": false, 00:20:29.834 "compare_and_write": false, 00:20:29.834 "abort": true, 00:20:29.834 "seek_hole": false, 00:20:29.834 "seek_data": false, 00:20:29.834 "copy": true, 00:20:29.834 "nvme_iov_md": false 00:20:29.834 }, 00:20:29.834 "memory_domains": [ 00:20:29.834 { 00:20:29.834 "dma_device_id": "system", 00:20:29.834 "dma_device_type": 1 00:20:29.834 }, 00:20:29.834 { 00:20:29.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.834 "dma_device_type": 2 00:20:29.834 } 00:20:29.834 ], 00:20:29.834 "driver_specific": {} 00:20:29.834 } 00:20:29.834 ] 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:30.091 13:39:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.091 "name": "Existed_Raid", 00:20:30.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.091 "strip_size_kb": 0, 00:20:30.091 "state": "configuring", 00:20:30.091 "raid_level": "raid1", 00:20:30.091 "superblock": false, 00:20:30.091 "num_base_bdevs": 4, 00:20:30.091 "num_base_bdevs_discovered": 2, 00:20:30.091 "num_base_bdevs_operational": 4, 00:20:30.091 "base_bdevs_list": [ 00:20:30.091 { 00:20:30.091 
"name": "BaseBdev1", 00:20:30.091 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:30.091 "is_configured": true, 00:20:30.091 "data_offset": 0, 00:20:30.091 "data_size": 65536 00:20:30.091 }, 00:20:30.091 { 00:20:30.091 "name": "BaseBdev2", 00:20:30.091 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:30.091 "is_configured": true, 00:20:30.091 "data_offset": 0, 00:20:30.091 "data_size": 65536 00:20:30.091 }, 00:20:30.091 { 00:20:30.091 "name": "BaseBdev3", 00:20:30.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.091 "is_configured": false, 00:20:30.091 "data_offset": 0, 00:20:30.091 "data_size": 0 00:20:30.091 }, 00:20:30.091 { 00:20:30.091 "name": "BaseBdev4", 00:20:30.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.091 "is_configured": false, 00:20:30.091 "data_offset": 0, 00:20:30.091 "data_size": 0 00:20:30.091 } 00:20:30.091 ] 00:20:30.091 }' 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.091 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.658 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:30.915 [2024-07-15 13:39:10.263118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:30.915 BaseBdev3 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:30.915 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:31.172 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:31.429 [ 00:20:31.429 { 00:20:31.429 "name": "BaseBdev3", 00:20:31.429 "aliases": [ 00:20:31.429 "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f" 00:20:31.429 ], 00:20:31.429 "product_name": "Malloc disk", 00:20:31.429 "block_size": 512, 00:20:31.429 "num_blocks": 65536, 00:20:31.429 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:31.429 "assigned_rate_limits": { 00:20:31.429 "rw_ios_per_sec": 0, 00:20:31.429 "rw_mbytes_per_sec": 0, 00:20:31.429 "r_mbytes_per_sec": 0, 00:20:31.429 "w_mbytes_per_sec": 0 00:20:31.429 }, 00:20:31.429 "claimed": true, 00:20:31.429 "claim_type": "exclusive_write", 00:20:31.429 "zoned": false, 00:20:31.429 "supported_io_types": { 00:20:31.429 "read": true, 00:20:31.429 "write": true, 00:20:31.429 "unmap": true, 00:20:31.429 "flush": true, 00:20:31.429 "reset": true, 00:20:31.429 "nvme_admin": false, 00:20:31.429 "nvme_io": false, 00:20:31.429 "nvme_io_md": false, 00:20:31.429 "write_zeroes": true, 00:20:31.429 "zcopy": true, 00:20:31.429 "get_zone_info": false, 00:20:31.429 "zone_management": false, 00:20:31.429 "zone_append": false, 00:20:31.429 "compare": false, 00:20:31.429 "compare_and_write": false, 00:20:31.429 "abort": true, 00:20:31.429 "seek_hole": false, 00:20:31.429 "seek_data": false, 00:20:31.429 "copy": true, 00:20:31.429 "nvme_iov_md": false 00:20:31.429 }, 00:20:31.429 "memory_domains": [ 00:20:31.429 { 00:20:31.429 "dma_device_id": "system", 
00:20:31.429 "dma_device_type": 1 00:20:31.429 }, 00:20:31.429 { 00:20:31.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.429 "dma_device_type": 2 00:20:31.429 } 00:20:31.429 ], 00:20:31.429 "driver_specific": {} 00:20:31.429 } 00:20:31.429 ] 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.429 13:39:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.686 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.686 "name": "Existed_Raid", 00:20:31.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.686 "strip_size_kb": 0, 00:20:31.686 "state": "configuring", 00:20:31.686 "raid_level": "raid1", 00:20:31.686 "superblock": false, 00:20:31.686 "num_base_bdevs": 4, 00:20:31.686 "num_base_bdevs_discovered": 3, 00:20:31.686 "num_base_bdevs_operational": 4, 00:20:31.686 "base_bdevs_list": [ 00:20:31.686 { 00:20:31.686 "name": "BaseBdev1", 00:20:31.686 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:31.686 "is_configured": true, 00:20:31.686 "data_offset": 0, 00:20:31.686 "data_size": 65536 00:20:31.686 }, 00:20:31.686 { 00:20:31.686 "name": "BaseBdev2", 00:20:31.686 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:31.686 "is_configured": true, 00:20:31.686 "data_offset": 0, 00:20:31.686 "data_size": 65536 00:20:31.686 }, 00:20:31.686 { 00:20:31.686 "name": "BaseBdev3", 00:20:31.686 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:31.686 "is_configured": true, 00:20:31.686 "data_offset": 0, 00:20:31.686 "data_size": 65536 00:20:31.686 }, 00:20:31.686 { 00:20:31.686 "name": "BaseBdev4", 00:20:31.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.686 "is_configured": false, 00:20:31.686 "data_offset": 0, 00:20:31.686 "data_size": 0 00:20:31.686 } 00:20:31.686 ] 00:20:31.686 }' 00:20:31.686 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.686 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.249 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:32.505 [2024-07-15 13:39:11.790558] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:32.505 [2024-07-15 13:39:11.790599] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xccf350 00:20:32.505 [2024-07-15 13:39:11.790608] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:32.505 [2024-07-15 13:39:11.790857] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccf020 00:20:32.505 [2024-07-15 13:39:11.791002] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xccf350 00:20:32.505 [2024-07-15 13:39:11.791013] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xccf350 00:20:32.505 [2024-07-15 13:39:11.791183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:32.505 BaseBdev4 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:32.505 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:32.761 13:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:33.017 [ 00:20:33.017 { 00:20:33.017 
"name": "BaseBdev4", 00:20:33.017 "aliases": [ 00:20:33.017 "ca3968aa-9c00-4c09-b8f4-71fafe47ce08" 00:20:33.017 ], 00:20:33.017 "product_name": "Malloc disk", 00:20:33.017 "block_size": 512, 00:20:33.017 "num_blocks": 65536, 00:20:33.017 "uuid": "ca3968aa-9c00-4c09-b8f4-71fafe47ce08", 00:20:33.017 "assigned_rate_limits": { 00:20:33.017 "rw_ios_per_sec": 0, 00:20:33.017 "rw_mbytes_per_sec": 0, 00:20:33.017 "r_mbytes_per_sec": 0, 00:20:33.017 "w_mbytes_per_sec": 0 00:20:33.017 }, 00:20:33.017 "claimed": true, 00:20:33.017 "claim_type": "exclusive_write", 00:20:33.017 "zoned": false, 00:20:33.017 "supported_io_types": { 00:20:33.017 "read": true, 00:20:33.017 "write": true, 00:20:33.017 "unmap": true, 00:20:33.017 "flush": true, 00:20:33.017 "reset": true, 00:20:33.017 "nvme_admin": false, 00:20:33.017 "nvme_io": false, 00:20:33.017 "nvme_io_md": false, 00:20:33.017 "write_zeroes": true, 00:20:33.017 "zcopy": true, 00:20:33.017 "get_zone_info": false, 00:20:33.017 "zone_management": false, 00:20:33.017 "zone_append": false, 00:20:33.017 "compare": false, 00:20:33.017 "compare_and_write": false, 00:20:33.017 "abort": true, 00:20:33.017 "seek_hole": false, 00:20:33.017 "seek_data": false, 00:20:33.017 "copy": true, 00:20:33.017 "nvme_iov_md": false 00:20:33.017 }, 00:20:33.017 "memory_domains": [ 00:20:33.017 { 00:20:33.017 "dma_device_id": "system", 00:20:33.018 "dma_device_type": 1 00:20:33.018 }, 00:20:33.018 { 00:20:33.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.018 "dma_device_type": 2 00:20:33.018 } 00:20:33.018 ], 00:20:33.018 "driver_specific": {} 00:20:33.018 } 00:20:33.018 ] 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.018 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.274 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.274 "name": "Existed_Raid", 00:20:33.274 "uuid": "2c4e8271-08ed-4871-afc1-8d810f950d32", 00:20:33.274 "strip_size_kb": 0, 00:20:33.274 "state": "online", 00:20:33.274 "raid_level": "raid1", 00:20:33.274 "superblock": false, 00:20:33.274 "num_base_bdevs": 4, 00:20:33.274 "num_base_bdevs_discovered": 4, 00:20:33.274 "num_base_bdevs_operational": 4, 00:20:33.274 "base_bdevs_list": [ 00:20:33.274 { 00:20:33.274 "name": "BaseBdev1", 00:20:33.274 "uuid": 
"02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:33.274 "is_configured": true, 00:20:33.274 "data_offset": 0, 00:20:33.274 "data_size": 65536 00:20:33.274 }, 00:20:33.274 { 00:20:33.274 "name": "BaseBdev2", 00:20:33.274 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:33.274 "is_configured": true, 00:20:33.274 "data_offset": 0, 00:20:33.274 "data_size": 65536 00:20:33.274 }, 00:20:33.274 { 00:20:33.274 "name": "BaseBdev3", 00:20:33.274 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:33.274 "is_configured": true, 00:20:33.274 "data_offset": 0, 00:20:33.274 "data_size": 65536 00:20:33.274 }, 00:20:33.274 { 00:20:33.274 "name": "BaseBdev4", 00:20:33.274 "uuid": "ca3968aa-9c00-4c09-b8f4-71fafe47ce08", 00:20:33.274 "is_configured": true, 00:20:33.274 "data_offset": 0, 00:20:33.274 "data_size": 65536 00:20:33.274 } 00:20:33.274 ] 00:20:33.274 }' 00:20:33.274 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.274 13:39:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:33.838 13:39:13 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:33.838 [2024-07-15 13:39:13.254798] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:34.095 "name": "Existed_Raid", 00:20:34.095 "aliases": [ 00:20:34.095 "2c4e8271-08ed-4871-afc1-8d810f950d32" 00:20:34.095 ], 00:20:34.095 "product_name": "Raid Volume", 00:20:34.095 "block_size": 512, 00:20:34.095 "num_blocks": 65536, 00:20:34.095 "uuid": "2c4e8271-08ed-4871-afc1-8d810f950d32", 00:20:34.095 "assigned_rate_limits": { 00:20:34.095 "rw_ios_per_sec": 0, 00:20:34.095 "rw_mbytes_per_sec": 0, 00:20:34.095 "r_mbytes_per_sec": 0, 00:20:34.095 "w_mbytes_per_sec": 0 00:20:34.095 }, 00:20:34.095 "claimed": false, 00:20:34.095 "zoned": false, 00:20:34.095 "supported_io_types": { 00:20:34.095 "read": true, 00:20:34.095 "write": true, 00:20:34.095 "unmap": false, 00:20:34.095 "flush": false, 00:20:34.095 "reset": true, 00:20:34.095 "nvme_admin": false, 00:20:34.095 "nvme_io": false, 00:20:34.095 "nvme_io_md": false, 00:20:34.095 "write_zeroes": true, 00:20:34.095 "zcopy": false, 00:20:34.095 "get_zone_info": false, 00:20:34.095 "zone_management": false, 00:20:34.095 "zone_append": false, 00:20:34.095 "compare": false, 00:20:34.095 "compare_and_write": false, 00:20:34.095 "abort": false, 00:20:34.095 "seek_hole": false, 00:20:34.095 "seek_data": false, 00:20:34.095 "copy": false, 00:20:34.095 "nvme_iov_md": false 00:20:34.095 }, 00:20:34.095 "memory_domains": [ 00:20:34.095 { 00:20:34.095 "dma_device_id": "system", 00:20:34.095 "dma_device_type": 1 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.095 "dma_device_type": 2 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "system", 00:20:34.095 "dma_device_type": 1 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.095 "dma_device_type": 2 
00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "system", 00:20:34.095 "dma_device_type": 1 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.095 "dma_device_type": 2 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "system", 00:20:34.095 "dma_device_type": 1 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.095 "dma_device_type": 2 00:20:34.095 } 00:20:34.095 ], 00:20:34.095 "driver_specific": { 00:20:34.095 "raid": { 00:20:34.095 "uuid": "2c4e8271-08ed-4871-afc1-8d810f950d32", 00:20:34.095 "strip_size_kb": 0, 00:20:34.095 "state": "online", 00:20:34.095 "raid_level": "raid1", 00:20:34.095 "superblock": false, 00:20:34.095 "num_base_bdevs": 4, 00:20:34.095 "num_base_bdevs_discovered": 4, 00:20:34.095 "num_base_bdevs_operational": 4, 00:20:34.095 "base_bdevs_list": [ 00:20:34.095 { 00:20:34.095 "name": "BaseBdev1", 00:20:34.095 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:34.095 "is_configured": true, 00:20:34.095 "data_offset": 0, 00:20:34.095 "data_size": 65536 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "name": "BaseBdev2", 00:20:34.095 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:34.095 "is_configured": true, 00:20:34.095 "data_offset": 0, 00:20:34.095 "data_size": 65536 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "name": "BaseBdev3", 00:20:34.095 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:34.095 "is_configured": true, 00:20:34.095 "data_offset": 0, 00:20:34.095 "data_size": 65536 00:20:34.095 }, 00:20:34.095 { 00:20:34.095 "name": "BaseBdev4", 00:20:34.095 "uuid": "ca3968aa-9c00-4c09-b8f4-71fafe47ce08", 00:20:34.095 "is_configured": true, 00:20:34.095 "data_offset": 0, 00:20:34.095 "data_size": 65536 00:20:34.095 } 00:20:34.095 ] 00:20:34.095 } 00:20:34.095 } 00:20:34.095 }' 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:34.095 BaseBdev2 00:20:34.095 BaseBdev3 00:20:34.095 BaseBdev4' 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:34.095 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.351 "name": "BaseBdev1", 00:20:34.351 "aliases": [ 00:20:34.351 "02ad51a4-09af-4451-bd9a-f77f1af67055" 00:20:34.351 ], 00:20:34.351 "product_name": "Malloc disk", 00:20:34.351 "block_size": 512, 00:20:34.351 "num_blocks": 65536, 00:20:34.351 "uuid": "02ad51a4-09af-4451-bd9a-f77f1af67055", 00:20:34.351 "assigned_rate_limits": { 00:20:34.351 "rw_ios_per_sec": 0, 00:20:34.351 "rw_mbytes_per_sec": 0, 00:20:34.351 "r_mbytes_per_sec": 0, 00:20:34.351 "w_mbytes_per_sec": 0 00:20:34.351 }, 00:20:34.351 "claimed": true, 00:20:34.351 "claim_type": "exclusive_write", 00:20:34.351 "zoned": false, 00:20:34.351 "supported_io_types": { 00:20:34.351 "read": true, 00:20:34.351 "write": true, 00:20:34.351 "unmap": true, 00:20:34.351 "flush": true, 00:20:34.351 "reset": true, 00:20:34.351 "nvme_admin": false, 00:20:34.351 "nvme_io": false, 00:20:34.351 "nvme_io_md": false, 00:20:34.351 "write_zeroes": true, 00:20:34.351 "zcopy": true, 00:20:34.351 "get_zone_info": false, 00:20:34.351 "zone_management": false, 00:20:34.351 "zone_append": false, 00:20:34.351 "compare": false, 00:20:34.351 "compare_and_write": false, 00:20:34.351 "abort": true, 00:20:34.351 "seek_hole": false, 00:20:34.351 "seek_data": false, 00:20:34.351 "copy": true, 00:20:34.351 
"nvme_iov_md": false 00:20:34.351 }, 00:20:34.351 "memory_domains": [ 00:20:34.351 { 00:20:34.351 "dma_device_id": "system", 00:20:34.351 "dma_device_type": 1 00:20:34.351 }, 00:20:34.351 { 00:20:34.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.351 "dma_device_type": 2 00:20:34.351 } 00:20:34.351 ], 00:20:34.351 "driver_specific": {} 00:20:34.351 }' 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.351 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:34.607 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.864 "name": "BaseBdev2", 00:20:34.864 "aliases": [ 00:20:34.864 "96bba3e4-4f76-47f3-ab57-9155e28e5a5b" 00:20:34.864 ], 00:20:34.864 "product_name": "Malloc disk", 00:20:34.864 "block_size": 512, 00:20:34.864 "num_blocks": 65536, 00:20:34.864 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:34.864 "assigned_rate_limits": { 00:20:34.864 "rw_ios_per_sec": 0, 00:20:34.864 "rw_mbytes_per_sec": 0, 00:20:34.864 "r_mbytes_per_sec": 0, 00:20:34.864 "w_mbytes_per_sec": 0 00:20:34.864 }, 00:20:34.864 "claimed": true, 00:20:34.864 "claim_type": "exclusive_write", 00:20:34.864 "zoned": false, 00:20:34.864 "supported_io_types": { 00:20:34.864 "read": true, 00:20:34.864 "write": true, 00:20:34.864 "unmap": true, 00:20:34.864 "flush": true, 00:20:34.864 "reset": true, 00:20:34.864 "nvme_admin": false, 00:20:34.864 "nvme_io": false, 00:20:34.864 "nvme_io_md": false, 00:20:34.864 "write_zeroes": true, 00:20:34.864 "zcopy": true, 00:20:34.864 "get_zone_info": false, 00:20:34.864 "zone_management": false, 00:20:34.864 "zone_append": false, 00:20:34.864 "compare": false, 00:20:34.864 "compare_and_write": false, 00:20:34.864 "abort": true, 00:20:34.864 "seek_hole": false, 00:20:34.864 "seek_data": false, 00:20:34.864 "copy": true, 00:20:34.864 "nvme_iov_md": false 00:20:34.864 }, 00:20:34.864 "memory_domains": [ 00:20:34.864 { 00:20:34.864 "dma_device_id": "system", 00:20:34.864 "dma_device_type": 1 00:20:34.864 }, 00:20:34.864 { 00:20:34.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.864 "dma_device_type": 2 00:20:34.864 } 00:20:34.864 ], 00:20:34.864 "driver_specific": {} 00:20:34.864 }' 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.864 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.121 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.121 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:35.122 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.378 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.378 "name": "BaseBdev3", 00:20:35.378 "aliases": [ 00:20:35.378 "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f" 00:20:35.378 ], 00:20:35.378 "product_name": "Malloc disk", 00:20:35.378 "block_size": 512, 00:20:35.378 "num_blocks": 65536, 00:20:35.378 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:35.378 "assigned_rate_limits": { 00:20:35.378 "rw_ios_per_sec": 0, 00:20:35.378 "rw_mbytes_per_sec": 0, 00:20:35.378 "r_mbytes_per_sec": 0, 00:20:35.378 "w_mbytes_per_sec": 0 00:20:35.378 }, 
00:20:35.378 "claimed": true, 00:20:35.378 "claim_type": "exclusive_write", 00:20:35.378 "zoned": false, 00:20:35.378 "supported_io_types": { 00:20:35.378 "read": true, 00:20:35.378 "write": true, 00:20:35.378 "unmap": true, 00:20:35.378 "flush": true, 00:20:35.378 "reset": true, 00:20:35.378 "nvme_admin": false, 00:20:35.378 "nvme_io": false, 00:20:35.378 "nvme_io_md": false, 00:20:35.378 "write_zeroes": true, 00:20:35.378 "zcopy": true, 00:20:35.378 "get_zone_info": false, 00:20:35.378 "zone_management": false, 00:20:35.378 "zone_append": false, 00:20:35.378 "compare": false, 00:20:35.378 "compare_and_write": false, 00:20:35.378 "abort": true, 00:20:35.378 "seek_hole": false, 00:20:35.378 "seek_data": false, 00:20:35.378 "copy": true, 00:20:35.378 "nvme_iov_md": false 00:20:35.378 }, 00:20:35.378 "memory_domains": [ 00:20:35.378 { 00:20:35.378 "dma_device_id": "system", 00:20:35.378 "dma_device_type": 1 00:20:35.378 }, 00:20:35.378 { 00:20:35.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.378 "dma_device_type": 2 00:20:35.378 } 00:20:35.378 ], 00:20:35.378 "driver_specific": {} 00:20:35.378 }' 00:20:35.378 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.378 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.378 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:35.378 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.635 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.635 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.635 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.635 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:35.635 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:35.635 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.892 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.892 "name": "BaseBdev4", 00:20:35.892 "aliases": [ 00:20:35.892 "ca3968aa-9c00-4c09-b8f4-71fafe47ce08" 00:20:35.892 ], 00:20:35.892 "product_name": "Malloc disk", 00:20:35.892 "block_size": 512, 00:20:35.892 "num_blocks": 65536, 00:20:35.892 "uuid": "ca3968aa-9c00-4c09-b8f4-71fafe47ce08", 00:20:35.892 "assigned_rate_limits": { 00:20:35.892 "rw_ios_per_sec": 0, 00:20:35.892 "rw_mbytes_per_sec": 0, 00:20:35.892 "r_mbytes_per_sec": 0, 00:20:35.892 "w_mbytes_per_sec": 0 00:20:35.892 }, 00:20:35.892 "claimed": true, 00:20:35.892 "claim_type": "exclusive_write", 00:20:35.892 "zoned": false, 00:20:35.892 "supported_io_types": { 00:20:35.892 "read": true, 00:20:35.892 "write": true, 00:20:35.892 "unmap": true, 00:20:35.892 "flush": true, 00:20:35.892 "reset": true, 00:20:35.892 "nvme_admin": false, 00:20:35.892 "nvme_io": false, 00:20:35.892 "nvme_io_md": false, 00:20:35.892 "write_zeroes": true, 00:20:35.892 "zcopy": true, 00:20:35.892 "get_zone_info": false, 00:20:35.892 "zone_management": false, 00:20:35.892 "zone_append": false, 00:20:35.892 "compare": false, 00:20:35.892 "compare_and_write": false, 
00:20:35.892 "abort": true, 00:20:35.892 "seek_hole": false, 00:20:35.892 "seek_data": false, 00:20:35.892 "copy": true, 00:20:35.892 "nvme_iov_md": false 00:20:35.892 }, 00:20:35.892 "memory_domains": [ 00:20:35.892 { 00:20:35.892 "dma_device_id": "system", 00:20:35.892 "dma_device_type": 1 00:20:35.892 }, 00:20:35.892 { 00:20:35.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.892 "dma_device_type": 2 00:20:35.892 } 00:20:35.892 ], 00:20:35.892 "driver_specific": {} 00:20:35.892 }' 00:20:35.892 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:36.149 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.406 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.406 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:36.406 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:36.663 [2024-07-15 13:39:15.869462] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.663 13:39:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.920 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.920 "name": "Existed_Raid", 00:20:36.920 "uuid": "2c4e8271-08ed-4871-afc1-8d810f950d32", 00:20:36.920 "strip_size_kb": 0, 00:20:36.920 "state": "online", 00:20:36.920 "raid_level": "raid1", 00:20:36.920 "superblock": false, 00:20:36.920 "num_base_bdevs": 4, 00:20:36.920 "num_base_bdevs_discovered": 3, 00:20:36.920 "num_base_bdevs_operational": 3, 00:20:36.920 "base_bdevs_list": [ 00:20:36.920 { 00:20:36.920 "name": null, 00:20:36.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.920 "is_configured": false, 00:20:36.920 "data_offset": 0, 00:20:36.920 "data_size": 65536 00:20:36.920 }, 00:20:36.920 { 00:20:36.920 "name": "BaseBdev2", 00:20:36.920 "uuid": "96bba3e4-4f76-47f3-ab57-9155e28e5a5b", 00:20:36.920 "is_configured": true, 00:20:36.920 "data_offset": 0, 00:20:36.920 "data_size": 65536 00:20:36.920 }, 00:20:36.920 { 00:20:36.920 "name": "BaseBdev3", 00:20:36.920 "uuid": "6c11a4c7-18f6-4bc4-9ec1-cd2077a3814f", 00:20:36.920 "is_configured": true, 00:20:36.920 "data_offset": 0, 00:20:36.920 "data_size": 65536 00:20:36.920 }, 00:20:36.920 { 00:20:36.920 "name": "BaseBdev4", 00:20:36.920 "uuid": "ca3968aa-9c00-4c09-b8f4-71fafe47ce08", 00:20:36.920 "is_configured": true, 00:20:36.920 "data_offset": 0, 00:20:36.920 "data_size": 65536 00:20:36.920 } 00:20:36.920 ] 00:20:36.920 }' 00:20:36.920 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.920 13:39:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.530 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:37.530 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:37.530 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.530 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:37.788 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:37.788 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:37.788 13:39:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:37.788 [2024-07-15 13:39:17.202480] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:38.048 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:38.048 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:38.048 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.048 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:38.307 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:38.307 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:38.307 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:38.307 [2024-07-15 13:39:17.706536] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:38.565 13:39:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:38.823 [2024-07-15 13:39:18.214426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:38.823 [2024-07-15 13:39:18.214509] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:38.823 [2024-07-15 13:39:18.226989] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:38.823 [2024-07-15 13:39:18.227024] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:38.823 [2024-07-15 13:39:18.227036] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xccf350 name Existed_Raid, state offline 00:20:38.823 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:38.823 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 
00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:39.082 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:39.340 BaseBdev2 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:39.340 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.599 13:39:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:39.858 [ 00:20:39.858 { 00:20:39.858 "name": "BaseBdev2", 00:20:39.858 "aliases": [ 00:20:39.858 "f3bd5e65-3eb2-4760-bbdc-3e6781a77894" 
00:20:39.858 ], 00:20:39.858 "product_name": "Malloc disk", 00:20:39.858 "block_size": 512, 00:20:39.858 "num_blocks": 65536, 00:20:39.858 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:39.858 "assigned_rate_limits": { 00:20:39.858 "rw_ios_per_sec": 0, 00:20:39.858 "rw_mbytes_per_sec": 0, 00:20:39.858 "r_mbytes_per_sec": 0, 00:20:39.858 "w_mbytes_per_sec": 0 00:20:39.858 }, 00:20:39.858 "claimed": false, 00:20:39.858 "zoned": false, 00:20:39.858 "supported_io_types": { 00:20:39.858 "read": true, 00:20:39.858 "write": true, 00:20:39.858 "unmap": true, 00:20:39.858 "flush": true, 00:20:39.858 "reset": true, 00:20:39.858 "nvme_admin": false, 00:20:39.858 "nvme_io": false, 00:20:39.858 "nvme_io_md": false, 00:20:39.858 "write_zeroes": true, 00:20:39.858 "zcopy": true, 00:20:39.858 "get_zone_info": false, 00:20:39.858 "zone_management": false, 00:20:39.858 "zone_append": false, 00:20:39.858 "compare": false, 00:20:39.858 "compare_and_write": false, 00:20:39.858 "abort": true, 00:20:39.858 "seek_hole": false, 00:20:39.858 "seek_data": false, 00:20:39.858 "copy": true, 00:20:39.858 "nvme_iov_md": false 00:20:39.858 }, 00:20:39.858 "memory_domains": [ 00:20:39.858 { 00:20:39.858 "dma_device_id": "system", 00:20:39.858 "dma_device_type": 1 00:20:39.858 }, 00:20:39.858 { 00:20:39.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.858 "dma_device_type": 2 00:20:39.858 } 00:20:39.858 ], 00:20:39.858 "driver_specific": {} 00:20:39.858 } 00:20:39.858 ] 00:20:39.858 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:39.858 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:39.858 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:39.858 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev3 00:20:40.116 BaseBdev3 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:40.116 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:40.374 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:40.635 [ 00:20:40.635 { 00:20:40.635 "name": "BaseBdev3", 00:20:40.635 "aliases": [ 00:20:40.635 "6caf9cff-5811-4148-8392-bd2067457a25" 00:20:40.635 ], 00:20:40.635 "product_name": "Malloc disk", 00:20:40.635 "block_size": 512, 00:20:40.635 "num_blocks": 65536, 00:20:40.635 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:40.635 "assigned_rate_limits": { 00:20:40.635 "rw_ios_per_sec": 0, 00:20:40.635 "rw_mbytes_per_sec": 0, 00:20:40.635 "r_mbytes_per_sec": 0, 00:20:40.635 "w_mbytes_per_sec": 0 00:20:40.635 }, 00:20:40.635 "claimed": false, 00:20:40.635 "zoned": false, 00:20:40.635 "supported_io_types": { 00:20:40.635 "read": true, 00:20:40.635 "write": true, 00:20:40.635 "unmap": true, 00:20:40.635 "flush": true, 00:20:40.635 "reset": true, 00:20:40.635 "nvme_admin": false, 00:20:40.635 "nvme_io": false, 00:20:40.635 "nvme_io_md": false, 
00:20:40.635 "write_zeroes": true, 00:20:40.635 "zcopy": true, 00:20:40.635 "get_zone_info": false, 00:20:40.635 "zone_management": false, 00:20:40.635 "zone_append": false, 00:20:40.635 "compare": false, 00:20:40.635 "compare_and_write": false, 00:20:40.635 "abort": true, 00:20:40.635 "seek_hole": false, 00:20:40.635 "seek_data": false, 00:20:40.635 "copy": true, 00:20:40.635 "nvme_iov_md": false 00:20:40.635 }, 00:20:40.635 "memory_domains": [ 00:20:40.635 { 00:20:40.635 "dma_device_id": "system", 00:20:40.635 "dma_device_type": 1 00:20:40.635 }, 00:20:40.635 { 00:20:40.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.635 "dma_device_type": 2 00:20:40.635 } 00:20:40.635 ], 00:20:40.635 "driver_specific": {} 00:20:40.635 } 00:20:40.635 ] 00:20:40.635 13:39:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:40.635 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:40.635 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:40.635 13:39:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:40.893 BaseBdev4 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:40.893 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:40.893 13:39:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:41.152 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:41.411 [ 00:20:41.411 { 00:20:41.411 "name": "BaseBdev4", 00:20:41.411 "aliases": [ 00:20:41.411 "5117ff43-eda2-41d0-afa9-d1ada0511474" 00:20:41.411 ], 00:20:41.411 "product_name": "Malloc disk", 00:20:41.411 "block_size": 512, 00:20:41.411 "num_blocks": 65536, 00:20:41.411 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:41.411 "assigned_rate_limits": { 00:20:41.411 "rw_ios_per_sec": 0, 00:20:41.411 "rw_mbytes_per_sec": 0, 00:20:41.411 "r_mbytes_per_sec": 0, 00:20:41.411 "w_mbytes_per_sec": 0 00:20:41.411 }, 00:20:41.411 "claimed": false, 00:20:41.411 "zoned": false, 00:20:41.411 "supported_io_types": { 00:20:41.411 "read": true, 00:20:41.411 "write": true, 00:20:41.411 "unmap": true, 00:20:41.411 "flush": true, 00:20:41.411 "reset": true, 00:20:41.411 "nvme_admin": false, 00:20:41.411 "nvme_io": false, 00:20:41.411 "nvme_io_md": false, 00:20:41.411 "write_zeroes": true, 00:20:41.411 "zcopy": true, 00:20:41.411 "get_zone_info": false, 00:20:41.411 "zone_management": false, 00:20:41.411 "zone_append": false, 00:20:41.411 "compare": false, 00:20:41.411 "compare_and_write": false, 00:20:41.411 "abort": true, 00:20:41.411 "seek_hole": false, 00:20:41.411 "seek_data": false, 00:20:41.411 "copy": true, 00:20:41.411 "nvme_iov_md": false 00:20:41.411 }, 00:20:41.411 "memory_domains": [ 00:20:41.411 { 00:20:41.411 "dma_device_id": "system", 00:20:41.411 "dma_device_type": 1 00:20:41.411 }, 00:20:41.411 { 00:20:41.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.411 "dma_device_type": 2 00:20:41.411 } 00:20:41.411 ], 00:20:41.411 "driver_specific": {} 
00:20:41.411 } 00:20:41.411 ] 00:20:41.411 13:39:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:41.411 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:41.411 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:41.411 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:41.670 [2024-07-15 13:39:20.944536] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:41.670 [2024-07-15 13:39:20.944578] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:41.670 [2024-07-15 13:39:20.944599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:41.670 [2024-07-15 13:39:20.945977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:41.670 [2024-07-15 13:39:20.946017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.670 13:39:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.670 13:39:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.928 13:39:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.928 "name": "Existed_Raid", 00:20:41.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.928 "strip_size_kb": 0, 00:20:41.928 "state": "configuring", 00:20:41.928 "raid_level": "raid1", 00:20:41.928 "superblock": false, 00:20:41.928 "num_base_bdevs": 4, 00:20:41.928 "num_base_bdevs_discovered": 3, 00:20:41.928 "num_base_bdevs_operational": 4, 00:20:41.928 "base_bdevs_list": [ 00:20:41.928 { 00:20:41.928 "name": "BaseBdev1", 00:20:41.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.928 "is_configured": false, 00:20:41.928 "data_offset": 0, 00:20:41.928 "data_size": 0 00:20:41.928 }, 00:20:41.928 { 00:20:41.928 "name": "BaseBdev2", 00:20:41.928 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:41.928 "is_configured": true, 00:20:41.928 "data_offset": 0, 00:20:41.928 "data_size": 65536 00:20:41.928 }, 00:20:41.928 { 00:20:41.928 "name": "BaseBdev3", 00:20:41.928 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:41.928 "is_configured": true, 00:20:41.928 "data_offset": 0, 00:20:41.928 "data_size": 65536 00:20:41.928 }, 00:20:41.928 { 00:20:41.928 "name": "BaseBdev4", 00:20:41.928 "uuid": 
"5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:41.928 "is_configured": true, 00:20:41.928 "data_offset": 0, 00:20:41.928 "data_size": 65536 00:20:41.928 } 00:20:41.928 ] 00:20:41.928 }' 00:20:41.928 13:39:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.928 13:39:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.495 13:39:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:42.753 [2024-07-15 13:39:22.019363] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.753 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:43.017 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.017 "name": "Existed_Raid", 00:20:43.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.017 "strip_size_kb": 0, 00:20:43.017 "state": "configuring", 00:20:43.017 "raid_level": "raid1", 00:20:43.017 "superblock": false, 00:20:43.017 "num_base_bdevs": 4, 00:20:43.017 "num_base_bdevs_discovered": 2, 00:20:43.017 "num_base_bdevs_operational": 4, 00:20:43.017 "base_bdevs_list": [ 00:20:43.017 { 00:20:43.017 "name": "BaseBdev1", 00:20:43.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.017 "is_configured": false, 00:20:43.017 "data_offset": 0, 00:20:43.017 "data_size": 0 00:20:43.017 }, 00:20:43.017 { 00:20:43.017 "name": null, 00:20:43.017 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:43.017 "is_configured": false, 00:20:43.017 "data_offset": 0, 00:20:43.017 "data_size": 65536 00:20:43.017 }, 00:20:43.017 { 00:20:43.017 "name": "BaseBdev3", 00:20:43.017 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:43.017 "is_configured": true, 00:20:43.017 "data_offset": 0, 00:20:43.017 "data_size": 65536 00:20:43.017 }, 00:20:43.017 { 00:20:43.017 "name": "BaseBdev4", 00:20:43.017 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:43.017 "is_configured": true, 00:20:43.017 "data_offset": 0, 00:20:43.017 "data_size": 65536 00:20:43.017 } 00:20:43.017 ] 00:20:43.017 }' 00:20:43.017 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.017 13:39:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.589 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.589 13:39:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:43.847 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:43.847 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:44.105 [2024-07-15 13:39:23.286205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:44.105 BaseBdev1 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:44.105 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:44.363 [ 00:20:44.363 { 00:20:44.363 "name": "BaseBdev1", 00:20:44.363 "aliases": [ 00:20:44.363 "a08479cb-30e7-491e-b1f4-4ecce88674c8" 00:20:44.363 ], 00:20:44.363 
"product_name": "Malloc disk", 00:20:44.363 "block_size": 512, 00:20:44.363 "num_blocks": 65536, 00:20:44.363 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:44.363 "assigned_rate_limits": { 00:20:44.363 "rw_ios_per_sec": 0, 00:20:44.363 "rw_mbytes_per_sec": 0, 00:20:44.363 "r_mbytes_per_sec": 0, 00:20:44.363 "w_mbytes_per_sec": 0 00:20:44.363 }, 00:20:44.363 "claimed": true, 00:20:44.363 "claim_type": "exclusive_write", 00:20:44.363 "zoned": false, 00:20:44.363 "supported_io_types": { 00:20:44.363 "read": true, 00:20:44.363 "write": true, 00:20:44.363 "unmap": true, 00:20:44.363 "flush": true, 00:20:44.363 "reset": true, 00:20:44.363 "nvme_admin": false, 00:20:44.363 "nvme_io": false, 00:20:44.363 "nvme_io_md": false, 00:20:44.363 "write_zeroes": true, 00:20:44.363 "zcopy": true, 00:20:44.363 "get_zone_info": false, 00:20:44.363 "zone_management": false, 00:20:44.363 "zone_append": false, 00:20:44.363 "compare": false, 00:20:44.363 "compare_and_write": false, 00:20:44.363 "abort": true, 00:20:44.363 "seek_hole": false, 00:20:44.363 "seek_data": false, 00:20:44.363 "copy": true, 00:20:44.363 "nvme_iov_md": false 00:20:44.363 }, 00:20:44.363 "memory_domains": [ 00:20:44.363 { 00:20:44.363 "dma_device_id": "system", 00:20:44.363 "dma_device_type": 1 00:20:44.363 }, 00:20:44.363 { 00:20:44.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.363 "dma_device_type": 2 00:20:44.363 } 00:20:44.363 ], 00:20:44.363 "driver_specific": {} 00:20:44.363 } 00:20:44.363 ] 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.363 
13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.363 13:39:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.620 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.620 "name": "Existed_Raid", 00:20:44.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.621 "strip_size_kb": 0, 00:20:44.621 "state": "configuring", 00:20:44.621 "raid_level": "raid1", 00:20:44.621 "superblock": false, 00:20:44.621 "num_base_bdevs": 4, 00:20:44.621 "num_base_bdevs_discovered": 3, 00:20:44.621 "num_base_bdevs_operational": 4, 00:20:44.621 "base_bdevs_list": [ 00:20:44.621 { 00:20:44.621 "name": "BaseBdev1", 00:20:44.621 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:44.621 "is_configured": true, 00:20:44.621 "data_offset": 0, 00:20:44.621 "data_size": 65536 00:20:44.621 }, 00:20:44.621 { 00:20:44.621 "name": null, 00:20:44.621 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:44.621 "is_configured": false, 00:20:44.621 "data_offset": 0, 
00:20:44.621 "data_size": 65536 00:20:44.621 }, 00:20:44.621 { 00:20:44.621 "name": "BaseBdev3", 00:20:44.621 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:44.621 "is_configured": true, 00:20:44.621 "data_offset": 0, 00:20:44.621 "data_size": 65536 00:20:44.621 }, 00:20:44.621 { 00:20:44.621 "name": "BaseBdev4", 00:20:44.621 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:44.621 "is_configured": true, 00:20:44.621 "data_offset": 0, 00:20:44.621 "data_size": 65536 00:20:44.621 } 00:20:44.621 ] 00:20:44.621 }' 00:20:44.621 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.621 13:39:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.229 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.229 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:45.486 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:45.486 13:39:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:45.743 [2024-07-15 13:39:25.026867] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.743 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.998 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.998 "name": "Existed_Raid", 00:20:45.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.998 "strip_size_kb": 0, 00:20:45.998 "state": "configuring", 00:20:45.998 "raid_level": "raid1", 00:20:45.998 "superblock": false, 00:20:45.998 "num_base_bdevs": 4, 00:20:45.998 "num_base_bdevs_discovered": 2, 00:20:45.998 "num_base_bdevs_operational": 4, 00:20:45.998 "base_bdevs_list": [ 00:20:45.998 { 00:20:45.998 "name": "BaseBdev1", 00:20:45.998 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:45.998 "is_configured": true, 00:20:45.998 "data_offset": 0, 00:20:45.998 "data_size": 65536 00:20:45.998 }, 00:20:45.998 { 00:20:45.998 "name": null, 00:20:45.998 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:45.998 "is_configured": false, 00:20:45.998 "data_offset": 0, 00:20:45.998 "data_size": 65536 00:20:45.998 }, 00:20:45.998 { 00:20:45.998 "name": null, 00:20:45.998 
"uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:45.998 "is_configured": false, 00:20:45.998 "data_offset": 0, 00:20:45.998 "data_size": 65536 00:20:45.998 }, 00:20:45.998 { 00:20:45.998 "name": "BaseBdev4", 00:20:45.998 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:45.998 "is_configured": true, 00:20:45.998 "data_offset": 0, 00:20:45.998 "data_size": 65536 00:20:45.998 } 00:20:45.998 ] 00:20:45.998 }' 00:20:45.998 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.998 13:39:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:46.560 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.560 13:39:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:46.895 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:46.895 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:47.185 [2024-07-15 13:39:26.374532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.185 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.449 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.449 "name": "Existed_Raid", 00:20:47.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.449 "strip_size_kb": 0, 00:20:47.449 "state": "configuring", 00:20:47.449 "raid_level": "raid1", 00:20:47.449 "superblock": false, 00:20:47.449 "num_base_bdevs": 4, 00:20:47.449 "num_base_bdevs_discovered": 3, 00:20:47.449 "num_base_bdevs_operational": 4, 00:20:47.449 "base_bdevs_list": [ 00:20:47.449 { 00:20:47.449 "name": "BaseBdev1", 00:20:47.449 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:47.449 "is_configured": true, 00:20:47.449 "data_offset": 0, 00:20:47.449 "data_size": 65536 00:20:47.449 }, 00:20:47.449 { 00:20:47.449 "name": null, 00:20:47.449 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:47.449 "is_configured": false, 00:20:47.449 "data_offset": 0, 00:20:47.449 "data_size": 65536 00:20:47.449 }, 00:20:47.449 { 00:20:47.449 "name": "BaseBdev3", 00:20:47.449 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:47.449 "is_configured": true, 
00:20:47.449 "data_offset": 0, 00:20:47.449 "data_size": 65536 00:20:47.449 }, 00:20:47.449 { 00:20:47.449 "name": "BaseBdev4", 00:20:47.449 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:47.449 "is_configured": true, 00:20:47.449 "data_offset": 0, 00:20:47.449 "data_size": 65536 00:20:47.449 } 00:20:47.449 ] 00:20:47.449 }' 00:20:47.449 13:39:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.449 13:39:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:48.014 13:39:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.014 13:39:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:48.272 13:39:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:48.272 13:39:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:48.837 [2024-07-15 13:39:27.974792] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.837 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.095 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.095 "name": "Existed_Raid", 00:20:49.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.095 "strip_size_kb": 0, 00:20:49.095 "state": "configuring", 00:20:49.095 "raid_level": "raid1", 00:20:49.095 "superblock": false, 00:20:49.095 "num_base_bdevs": 4, 00:20:49.095 "num_base_bdevs_discovered": 2, 00:20:49.095 "num_base_bdevs_operational": 4, 00:20:49.095 "base_bdevs_list": [ 00:20:49.095 { 00:20:49.095 "name": null, 00:20:49.095 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:49.095 "is_configured": false, 00:20:49.095 "data_offset": 0, 00:20:49.095 "data_size": 65536 00:20:49.095 }, 00:20:49.095 { 00:20:49.095 "name": null, 00:20:49.095 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:49.095 "is_configured": false, 00:20:49.095 "data_offset": 0, 00:20:49.095 "data_size": 65536 00:20:49.095 }, 00:20:49.095 { 00:20:49.095 "name": "BaseBdev3", 00:20:49.095 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:49.095 "is_configured": true, 00:20:49.095 "data_offset": 0, 00:20:49.095 "data_size": 65536 00:20:49.095 }, 00:20:49.095 { 00:20:49.095 "name": 
"BaseBdev4", 00:20:49.095 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:49.095 "is_configured": true, 00:20:49.095 "data_offset": 0, 00:20:49.095 "data_size": 65536 00:20:49.095 } 00:20:49.095 ] 00:20:49.095 }' 00:20:49.095 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.095 13:39:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.660 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.660 13:39:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:49.917 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:49.917 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:50.174 [2024-07-15 13:39:29.347001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.174 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.432 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.432 "name": "Existed_Raid", 00:20:50.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.432 "strip_size_kb": 0, 00:20:50.432 "state": "configuring", 00:20:50.432 "raid_level": "raid1", 00:20:50.432 "superblock": false, 00:20:50.432 "num_base_bdevs": 4, 00:20:50.432 "num_base_bdevs_discovered": 3, 00:20:50.432 "num_base_bdevs_operational": 4, 00:20:50.432 "base_bdevs_list": [ 00:20:50.432 { 00:20:50.432 "name": null, 00:20:50.432 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:50.432 "is_configured": false, 00:20:50.432 "data_offset": 0, 00:20:50.432 "data_size": 65536 00:20:50.432 }, 00:20:50.432 { 00:20:50.432 "name": "BaseBdev2", 00:20:50.432 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:50.432 "is_configured": true, 00:20:50.432 "data_offset": 0, 00:20:50.432 "data_size": 65536 00:20:50.432 }, 00:20:50.432 { 00:20:50.432 "name": "BaseBdev3", 00:20:50.432 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:50.432 "is_configured": true, 00:20:50.432 "data_offset": 0, 00:20:50.432 "data_size": 65536 00:20:50.432 }, 00:20:50.432 { 00:20:50.432 "name": "BaseBdev4", 00:20:50.432 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:50.432 
"is_configured": true, 00:20:50.432 "data_offset": 0, 00:20:50.432 "data_size": 65536 00:20:50.432 } 00:20:50.432 ] 00:20:50.432 }' 00:20:50.432 13:39:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.432 13:39:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.997 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:50.997 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.255 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:51.255 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.255 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:51.513 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a08479cb-30e7-491e-b1f4-4ecce88674c8 00:20:51.796 [2024-07-15 13:39:30.942700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:51.796 [2024-07-15 13:39:30.942741] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xccd610 00:20:51.796 [2024-07-15 13:39:30.942750] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:51.796 [2024-07-15 13:39:30.942950] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccea70 00:20:51.796 [2024-07-15 13:39:30.943074] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xccd610 00:20:51.796 [2024-07-15 
13:39:30.943084] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xccd610 00:20:51.796 [2024-07-15 13:39:30.943249] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:51.796 NewBaseBdev 00:20:51.796 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:51.796 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:51.796 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:51.797 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:51.797 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:51.797 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:51.797 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.797 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:52.053 [ 00:20:52.053 { 00:20:52.053 "name": "NewBaseBdev", 00:20:52.053 "aliases": [ 00:20:52.053 "a08479cb-30e7-491e-b1f4-4ecce88674c8" 00:20:52.053 ], 00:20:52.053 "product_name": "Malloc disk", 00:20:52.053 "block_size": 512, 00:20:52.053 "num_blocks": 65536, 00:20:52.053 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:52.053 "assigned_rate_limits": { 00:20:52.053 "rw_ios_per_sec": 0, 00:20:52.053 "rw_mbytes_per_sec": 0, 00:20:52.053 "r_mbytes_per_sec": 0, 00:20:52.053 "w_mbytes_per_sec": 0 00:20:52.053 }, 00:20:52.053 "claimed": true, 00:20:52.053 "claim_type": "exclusive_write", 00:20:52.053 "zoned": 
false, 00:20:52.053 "supported_io_types": { 00:20:52.053 "read": true, 00:20:52.053 "write": true, 00:20:52.053 "unmap": true, 00:20:52.053 "flush": true, 00:20:52.053 "reset": true, 00:20:52.053 "nvme_admin": false, 00:20:52.053 "nvme_io": false, 00:20:52.053 "nvme_io_md": false, 00:20:52.054 "write_zeroes": true, 00:20:52.054 "zcopy": true, 00:20:52.054 "get_zone_info": false, 00:20:52.054 "zone_management": false, 00:20:52.054 "zone_append": false, 00:20:52.054 "compare": false, 00:20:52.054 "compare_and_write": false, 00:20:52.054 "abort": true, 00:20:52.054 "seek_hole": false, 00:20:52.054 "seek_data": false, 00:20:52.054 "copy": true, 00:20:52.054 "nvme_iov_md": false 00:20:52.054 }, 00:20:52.054 "memory_domains": [ 00:20:52.054 { 00:20:52.054 "dma_device_id": "system", 00:20:52.054 "dma_device_type": 1 00:20:52.054 }, 00:20:52.054 { 00:20:52.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.054 "dma_device_type": 2 00:20:52.054 } 00:20:52.054 ], 00:20:52.054 "driver_specific": {} 00:20:52.054 } 00:20:52.054 ] 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.054 13:39:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.054 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.310 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.310 "name": "Existed_Raid", 00:20:52.310 "uuid": "80c19fe0-dfbb-4a4a-af7a-86a6d916dcd6", 00:20:52.310 "strip_size_kb": 0, 00:20:52.310 "state": "online", 00:20:52.310 "raid_level": "raid1", 00:20:52.310 "superblock": false, 00:20:52.310 "num_base_bdevs": 4, 00:20:52.310 "num_base_bdevs_discovered": 4, 00:20:52.310 "num_base_bdevs_operational": 4, 00:20:52.310 "base_bdevs_list": [ 00:20:52.310 { 00:20:52.310 "name": "NewBaseBdev", 00:20:52.310 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:52.310 "is_configured": true, 00:20:52.310 "data_offset": 0, 00:20:52.310 "data_size": 65536 00:20:52.310 }, 00:20:52.310 { 00:20:52.310 "name": "BaseBdev2", 00:20:52.310 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:52.310 "is_configured": true, 00:20:52.310 "data_offset": 0, 00:20:52.310 "data_size": 65536 00:20:52.310 }, 00:20:52.310 { 00:20:52.310 "name": "BaseBdev3", 00:20:52.310 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:52.310 "is_configured": true, 00:20:52.310 "data_offset": 0, 00:20:52.310 "data_size": 65536 00:20:52.310 }, 00:20:52.310 { 00:20:52.310 "name": "BaseBdev4", 00:20:52.310 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:52.310 "is_configured": true, 00:20:52.310 "data_offset": 0, 00:20:52.310 
"data_size": 65536 00:20:52.310 } 00:20:52.310 ] 00:20:52.310 }' 00:20:52.310 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.310 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:53.240 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:53.497 [2024-07-15 13:39:32.759862] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:53.497 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:53.497 "name": "Existed_Raid", 00:20:53.497 "aliases": [ 00:20:53.497 "80c19fe0-dfbb-4a4a-af7a-86a6d916dcd6" 00:20:53.497 ], 00:20:53.497 "product_name": "Raid Volume", 00:20:53.497 "block_size": 512, 00:20:53.497 "num_blocks": 65536, 00:20:53.497 "uuid": "80c19fe0-dfbb-4a4a-af7a-86a6d916dcd6", 00:20:53.497 "assigned_rate_limits": { 00:20:53.497 "rw_ios_per_sec": 0, 00:20:53.497 "rw_mbytes_per_sec": 0, 00:20:53.497 "r_mbytes_per_sec": 0, 00:20:53.497 "w_mbytes_per_sec": 0 00:20:53.497 }, 00:20:53.497 "claimed": false, 
00:20:53.497 "zoned": false, 00:20:53.497 "supported_io_types": { 00:20:53.497 "read": true, 00:20:53.497 "write": true, 00:20:53.497 "unmap": false, 00:20:53.497 "flush": false, 00:20:53.497 "reset": true, 00:20:53.497 "nvme_admin": false, 00:20:53.497 "nvme_io": false, 00:20:53.497 "nvme_io_md": false, 00:20:53.497 "write_zeroes": true, 00:20:53.497 "zcopy": false, 00:20:53.497 "get_zone_info": false, 00:20:53.497 "zone_management": false, 00:20:53.497 "zone_append": false, 00:20:53.497 "compare": false, 00:20:53.497 "compare_and_write": false, 00:20:53.497 "abort": false, 00:20:53.497 "seek_hole": false, 00:20:53.497 "seek_data": false, 00:20:53.497 "copy": false, 00:20:53.497 "nvme_iov_md": false 00:20:53.497 }, 00:20:53.497 "memory_domains": [ 00:20:53.497 { 00:20:53.497 "dma_device_id": "system", 00:20:53.497 "dma_device_type": 1 00:20:53.497 }, 00:20:53.497 { 00:20:53.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.497 "dma_device_type": 2 00:20:53.497 }, 00:20:53.497 { 00:20:53.497 "dma_device_id": "system", 00:20:53.497 "dma_device_type": 1 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.498 "dma_device_type": 2 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "dma_device_id": "system", 00:20:53.498 "dma_device_type": 1 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.498 "dma_device_type": 2 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "dma_device_id": "system", 00:20:53.498 "dma_device_type": 1 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.498 "dma_device_type": 2 00:20:53.498 } 00:20:53.498 ], 00:20:53.498 "driver_specific": { 00:20:53.498 "raid": { 00:20:53.498 "uuid": "80c19fe0-dfbb-4a4a-af7a-86a6d916dcd6", 00:20:53.498 "strip_size_kb": 0, 00:20:53.498 "state": "online", 00:20:53.498 "raid_level": "raid1", 00:20:53.498 "superblock": false, 00:20:53.498 "num_base_bdevs": 4, 00:20:53.498 
"num_base_bdevs_discovered": 4, 00:20:53.498 "num_base_bdevs_operational": 4, 00:20:53.498 "base_bdevs_list": [ 00:20:53.498 { 00:20:53.498 "name": "NewBaseBdev", 00:20:53.498 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:53.498 "is_configured": true, 00:20:53.498 "data_offset": 0, 00:20:53.498 "data_size": 65536 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "name": "BaseBdev2", 00:20:53.498 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:53.498 "is_configured": true, 00:20:53.498 "data_offset": 0, 00:20:53.498 "data_size": 65536 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "name": "BaseBdev3", 00:20:53.498 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:53.498 "is_configured": true, 00:20:53.498 "data_offset": 0, 00:20:53.498 "data_size": 65536 00:20:53.498 }, 00:20:53.498 { 00:20:53.498 "name": "BaseBdev4", 00:20:53.498 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:53.498 "is_configured": true, 00:20:53.498 "data_offset": 0, 00:20:53.498 "data_size": 65536 00:20:53.498 } 00:20:53.498 ] 00:20:53.498 } 00:20:53.498 } 00:20:53.498 }' 00:20:53.498 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:53.498 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:53.498 BaseBdev2 00:20:53.498 BaseBdev3 00:20:53.498 BaseBdev4' 00:20:53.498 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:53.498 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:53.498 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:53.754 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:53.754 "name": "NewBaseBdev", 
00:20:53.754 "aliases": [ 00:20:53.754 "a08479cb-30e7-491e-b1f4-4ecce88674c8" 00:20:53.754 ], 00:20:53.754 "product_name": "Malloc disk", 00:20:53.754 "block_size": 512, 00:20:53.754 "num_blocks": 65536, 00:20:53.754 "uuid": "a08479cb-30e7-491e-b1f4-4ecce88674c8", 00:20:53.754 "assigned_rate_limits": { 00:20:53.754 "rw_ios_per_sec": 0, 00:20:53.754 "rw_mbytes_per_sec": 0, 00:20:53.754 "r_mbytes_per_sec": 0, 00:20:53.754 "w_mbytes_per_sec": 0 00:20:53.754 }, 00:20:53.754 "claimed": true, 00:20:53.754 "claim_type": "exclusive_write", 00:20:53.754 "zoned": false, 00:20:53.754 "supported_io_types": { 00:20:53.754 "read": true, 00:20:53.754 "write": true, 00:20:53.754 "unmap": true, 00:20:53.754 "flush": true, 00:20:53.754 "reset": true, 00:20:53.754 "nvme_admin": false, 00:20:53.754 "nvme_io": false, 00:20:53.754 "nvme_io_md": false, 00:20:53.754 "write_zeroes": true, 00:20:53.754 "zcopy": true, 00:20:53.754 "get_zone_info": false, 00:20:53.754 "zone_management": false, 00:20:53.754 "zone_append": false, 00:20:53.754 "compare": false, 00:20:53.754 "compare_and_write": false, 00:20:53.754 "abort": true, 00:20:53.755 "seek_hole": false, 00:20:53.755 "seek_data": false, 00:20:53.755 "copy": true, 00:20:53.755 "nvme_iov_md": false 00:20:53.755 }, 00:20:53.755 "memory_domains": [ 00:20:53.755 { 00:20:53.755 "dma_device_id": "system", 00:20:53.755 "dma_device_type": 1 00:20:53.755 }, 00:20:53.755 { 00:20:53.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.755 "dma_device_type": 2 00:20:53.755 } 00:20:53.755 ], 00:20:53.755 "driver_specific": {} 00:20:53.755 }' 00:20:53.755 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.755 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.010 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.011 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.267 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.267 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.267 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:54.267 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.267 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.267 "name": "BaseBdev2", 00:20:54.267 "aliases": [ 00:20:54.267 "f3bd5e65-3eb2-4760-bbdc-3e6781a77894" 00:20:54.267 ], 00:20:54.267 "product_name": "Malloc disk", 00:20:54.267 "block_size": 512, 00:20:54.267 "num_blocks": 65536, 00:20:54.267 "uuid": "f3bd5e65-3eb2-4760-bbdc-3e6781a77894", 00:20:54.267 "assigned_rate_limits": { 00:20:54.267 "rw_ios_per_sec": 0, 00:20:54.267 "rw_mbytes_per_sec": 0, 00:20:54.267 "r_mbytes_per_sec": 0, 00:20:54.267 "w_mbytes_per_sec": 0 00:20:54.267 }, 00:20:54.267 "claimed": true, 00:20:54.267 "claim_type": "exclusive_write", 00:20:54.267 "zoned": false, 00:20:54.267 "supported_io_types": { 00:20:54.267 
"read": true, 00:20:54.267 "write": true, 00:20:54.267 "unmap": true, 00:20:54.267 "flush": true, 00:20:54.267 "reset": true, 00:20:54.267 "nvme_admin": false, 00:20:54.267 "nvme_io": false, 00:20:54.267 "nvme_io_md": false, 00:20:54.267 "write_zeroes": true, 00:20:54.267 "zcopy": true, 00:20:54.267 "get_zone_info": false, 00:20:54.267 "zone_management": false, 00:20:54.267 "zone_append": false, 00:20:54.267 "compare": false, 00:20:54.267 "compare_and_write": false, 00:20:54.267 "abort": true, 00:20:54.267 "seek_hole": false, 00:20:54.267 "seek_data": false, 00:20:54.267 "copy": true, 00:20:54.268 "nvme_iov_md": false 00:20:54.268 }, 00:20:54.268 "memory_domains": [ 00:20:54.268 { 00:20:54.268 "dma_device_id": "system", 00:20:54.268 "dma_device_type": 1 00:20:54.268 }, 00:20:54.268 { 00:20:54.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.268 "dma_device_type": 2 00:20:54.268 } 00:20:54.268 ], 00:20:54.268 "driver_specific": {} 00:20:54.268 }' 00:20:54.268 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.524 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:20:54.780 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.780 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.780 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.781 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:54.781 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.781 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.781 "name": "BaseBdev3", 00:20:54.781 "aliases": [ 00:20:54.781 "6caf9cff-5811-4148-8392-bd2067457a25" 00:20:54.781 ], 00:20:54.781 "product_name": "Malloc disk", 00:20:54.781 "block_size": 512, 00:20:54.781 "num_blocks": 65536, 00:20:54.781 "uuid": "6caf9cff-5811-4148-8392-bd2067457a25", 00:20:54.781 "assigned_rate_limits": { 00:20:54.781 "rw_ios_per_sec": 0, 00:20:54.781 "rw_mbytes_per_sec": 0, 00:20:54.781 "r_mbytes_per_sec": 0, 00:20:54.781 "w_mbytes_per_sec": 0 00:20:54.781 }, 00:20:54.781 "claimed": true, 00:20:54.781 "claim_type": "exclusive_write", 00:20:54.781 "zoned": false, 00:20:54.781 "supported_io_types": { 00:20:54.781 "read": true, 00:20:54.781 "write": true, 00:20:54.781 "unmap": true, 00:20:54.781 "flush": true, 00:20:54.781 "reset": true, 00:20:54.781 "nvme_admin": false, 00:20:54.781 "nvme_io": false, 00:20:54.781 "nvme_io_md": false, 00:20:54.781 "write_zeroes": true, 00:20:54.781 "zcopy": true, 00:20:54.781 "get_zone_info": false, 00:20:54.781 "zone_management": false, 00:20:54.781 "zone_append": false, 00:20:54.781 "compare": false, 00:20:54.781 "compare_and_write": false, 00:20:54.781 "abort": true, 00:20:54.781 "seek_hole": false, 00:20:54.781 "seek_data": false, 00:20:54.781 "copy": true, 00:20:54.781 "nvme_iov_md": 
false 00:20:54.781 }, 00:20:54.781 "memory_domains": [ 00:20:54.781 { 00:20:54.781 "dma_device_id": "system", 00:20:54.781 "dma_device_type": 1 00:20:54.781 }, 00:20:54.781 { 00:20:54.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.781 "dma_device_type": 2 00:20:54.781 } 00:20:54.781 ], 00:20:54.781 "driver_specific": {} 00:20:54.781 }' 00:20:54.781 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.038 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:55.295 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.553 "name": "BaseBdev4", 00:20:55.553 "aliases": [ 00:20:55.553 "5117ff43-eda2-41d0-afa9-d1ada0511474" 00:20:55.553 ], 00:20:55.553 "product_name": "Malloc disk", 00:20:55.553 "block_size": 512, 00:20:55.553 "num_blocks": 65536, 00:20:55.553 "uuid": "5117ff43-eda2-41d0-afa9-d1ada0511474", 00:20:55.553 "assigned_rate_limits": { 00:20:55.553 "rw_ios_per_sec": 0, 00:20:55.553 "rw_mbytes_per_sec": 0, 00:20:55.553 "r_mbytes_per_sec": 0, 00:20:55.553 "w_mbytes_per_sec": 0 00:20:55.553 }, 00:20:55.553 "claimed": true, 00:20:55.553 "claim_type": "exclusive_write", 00:20:55.553 "zoned": false, 00:20:55.553 "supported_io_types": { 00:20:55.553 "read": true, 00:20:55.553 "write": true, 00:20:55.553 "unmap": true, 00:20:55.553 "flush": true, 00:20:55.553 "reset": true, 00:20:55.553 "nvme_admin": false, 00:20:55.553 "nvme_io": false, 00:20:55.553 "nvme_io_md": false, 00:20:55.553 "write_zeroes": true, 00:20:55.553 "zcopy": true, 00:20:55.553 "get_zone_info": false, 00:20:55.553 "zone_management": false, 00:20:55.553 "zone_append": false, 00:20:55.553 "compare": false, 00:20:55.553 "compare_and_write": false, 00:20:55.553 "abort": true, 00:20:55.553 "seek_hole": false, 00:20:55.553 "seek_data": false, 00:20:55.553 "copy": true, 00:20:55.553 "nvme_iov_md": false 00:20:55.553 }, 00:20:55.553 "memory_domains": [ 00:20:55.553 { 00:20:55.553 "dma_device_id": "system", 00:20:55.553 "dma_device_type": 1 00:20:55.553 }, 00:20:55.553 { 00:20:55.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.553 "dma_device_type": 2 00:20:55.553 } 00:20:55.553 ], 00:20:55.553 "driver_specific": {} 00:20:55.553 }' 00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.553 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.810 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.810 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.810 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.810 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.810 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.810 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.810 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.811 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:56.068 [2024-07-15 13:39:35.390548] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:56.068 [2024-07-15 13:39:35.390575] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:56.068 [2024-07-15 13:39:35.390627] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:56.068 [2024-07-15 13:39:35.390896] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:56.068 [2024-07-15 13:39:35.390910] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xccd610 name Existed_Raid, state offline 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2158156 00:20:56.068 13:39:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2158156 ']' 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2158156 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2158156 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2158156' 00:20:56.068 killing process with pid 2158156 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2158156 00:20:56.068 [2024-07-15 13:39:35.454246] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:56.068 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2158156 00:20:56.068 [2024-07-15 13:39:35.489809] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:56.326 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:56.326 00:20:56.326 real 0m32.586s 00:20:56.326 user 0m59.902s 00:20:56.326 sys 0m5.770s 00:20:56.326 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:56.326 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.326 ************************************ 00:20:56.326 END TEST raid_state_function_test 00:20:56.326 ************************************ 00:20:56.326 13:39:35 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:20:56.326 13:39:35 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:20:56.326 13:39:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:56.326 13:39:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:56.326 13:39:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:56.584 ************************************ 00:20:56.584 START TEST raid_state_function_test_sb 00:20:56.584 ************************************ 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2163390 00:20:56.584 13:39:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2163390' 00:20:56.584 Process raid pid: 2163390 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2163390 /var/tmp/spdk-raid.sock 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2163390 ']' 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:56.584 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:56.585 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:56.585 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:56.585 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.585 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:56.585 [2024-07-15 13:39:35.838729] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:20:56.585 [2024-07-15 13:39:35.838798] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:56.585 [2024-07-15 13:39:35.967871] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:56.842 [2024-07-15 13:39:36.069702] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:56.842 [2024-07-15 13:39:36.138869] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:56.842 [2024-07-15 13:39:36.138906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:57.408 13:39:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:57.408 13:39:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:57.408 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:57.973 [2024-07-15 13:39:37.094352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:57.973 [2024-07-15 13:39:37.094395] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:57.973 [2024-07-15 13:39:37.094406] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:57.973 [2024-07-15 13:39:37.094418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:57.973 [2024-07-15 13:39:37.094427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:57.973 [2024-07-15 13:39:37.094438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:57.973 [2024-07-15 13:39:37.094447] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:57.973 [2024-07-15 13:39:37.094458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.973 "name": "Existed_Raid", 00:20:57.973 "uuid": "9dbcfd84-b87d-46ee-b3b9-bc08fe79230c", 
00:20:57.973 "strip_size_kb": 0, 00:20:57.973 "state": "configuring", 00:20:57.973 "raid_level": "raid1", 00:20:57.973 "superblock": true, 00:20:57.973 "num_base_bdevs": 4, 00:20:57.973 "num_base_bdevs_discovered": 0, 00:20:57.973 "num_base_bdevs_operational": 4, 00:20:57.973 "base_bdevs_list": [ 00:20:57.973 { 00:20:57.973 "name": "BaseBdev1", 00:20:57.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.973 "is_configured": false, 00:20:57.973 "data_offset": 0, 00:20:57.973 "data_size": 0 00:20:57.973 }, 00:20:57.973 { 00:20:57.973 "name": "BaseBdev2", 00:20:57.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.973 "is_configured": false, 00:20:57.973 "data_offset": 0, 00:20:57.973 "data_size": 0 00:20:57.973 }, 00:20:57.973 { 00:20:57.973 "name": "BaseBdev3", 00:20:57.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.973 "is_configured": false, 00:20:57.973 "data_offset": 0, 00:20:57.973 "data_size": 0 00:20:57.973 }, 00:20:57.973 { 00:20:57.973 "name": "BaseBdev4", 00:20:57.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.973 "is_configured": false, 00:20:57.973 "data_offset": 0, 00:20:57.973 "data_size": 0 00:20:57.973 } 00:20:57.973 ] 00:20:57.973 }' 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.973 13:39:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.907 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:59.165 [2024-07-15 13:39:38.453798] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:59.165 [2024-07-15 13:39:38.453830] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb2aa0 name Existed_Raid, state configuring 00:20:59.165 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:59.452 [2024-07-15 13:39:38.630296] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:59.452 [2024-07-15 13:39:38.630326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:59.452 [2024-07-15 13:39:38.630336] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:59.452 [2024-07-15 13:39:38.630347] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:59.452 [2024-07-15 13:39:38.630356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:59.452 [2024-07-15 13:39:38.630367] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:59.452 [2024-07-15 13:39:38.630376] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:59.452 [2024-07-15 13:39:38.630387] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:59.452 [2024-07-15 13:39:38.816811] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:59.452 BaseBdev1 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.452 13:39:38 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.452 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.724 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:59.982 [ 00:20:59.982 { 00:20:59.982 "name": "BaseBdev1", 00:20:59.982 "aliases": [ 00:20:59.982 "da88a6d3-e281-4795-b85d-b2c2fdd7f940" 00:20:59.982 ], 00:20:59.982 "product_name": "Malloc disk", 00:20:59.982 "block_size": 512, 00:20:59.982 "num_blocks": 65536, 00:20:59.982 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:20:59.982 "assigned_rate_limits": { 00:20:59.982 "rw_ios_per_sec": 0, 00:20:59.982 "rw_mbytes_per_sec": 0, 00:20:59.982 "r_mbytes_per_sec": 0, 00:20:59.982 "w_mbytes_per_sec": 0 00:20:59.982 }, 00:20:59.982 "claimed": true, 00:20:59.982 "claim_type": "exclusive_write", 00:20:59.982 "zoned": false, 00:20:59.982 "supported_io_types": { 00:20:59.982 "read": true, 00:20:59.982 "write": true, 00:20:59.982 "unmap": true, 00:20:59.982 "flush": true, 00:20:59.982 "reset": true, 00:20:59.982 "nvme_admin": false, 00:20:59.982 "nvme_io": false, 00:20:59.982 "nvme_io_md": false, 00:20:59.982 "write_zeroes": true, 00:20:59.982 "zcopy": true, 00:20:59.982 "get_zone_info": false, 00:20:59.982 "zone_management": false, 00:20:59.982 "zone_append": false, 00:20:59.982 "compare": false, 00:20:59.982 "compare_and_write": false, 00:20:59.982 "abort": true, 00:20:59.982 "seek_hole": false, 00:20:59.982 "seek_data": false, 
00:20:59.982 "copy": true, 00:20:59.982 "nvme_iov_md": false 00:20:59.982 }, 00:20:59.982 "memory_domains": [ 00:20:59.982 { 00:20:59.982 "dma_device_id": "system", 00:20:59.982 "dma_device_type": 1 00:20:59.982 }, 00:20:59.982 { 00:20:59.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.982 "dma_device_type": 2 00:20:59.982 } 00:20:59.982 ], 00:20:59.982 "driver_specific": {} 00:20:59.983 } 00:20:59.983 ] 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.983 "name": "Existed_Raid", 00:20:59.983 "uuid": "236775d6-7955-4831-ad24-f4178d12ca0c", 00:20:59.983 "strip_size_kb": 0, 00:20:59.983 "state": "configuring", 00:20:59.983 "raid_level": "raid1", 00:20:59.983 "superblock": true, 00:20:59.983 "num_base_bdevs": 4, 00:20:59.983 "num_base_bdevs_discovered": 1, 00:20:59.983 "num_base_bdevs_operational": 4, 00:20:59.983 "base_bdevs_list": [ 00:20:59.983 { 00:20:59.983 "name": "BaseBdev1", 00:20:59.983 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:20:59.983 "is_configured": true, 00:20:59.983 "data_offset": 2048, 00:20:59.983 "data_size": 63488 00:20:59.983 }, 00:20:59.983 { 00:20:59.983 "name": "BaseBdev2", 00:20:59.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.983 "is_configured": false, 00:20:59.983 "data_offset": 0, 00:20:59.983 "data_size": 0 00:20:59.983 }, 00:20:59.983 { 00:20:59.983 "name": "BaseBdev3", 00:20:59.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.983 "is_configured": false, 00:20:59.983 "data_offset": 0, 00:20:59.983 "data_size": 0 00:20:59.983 }, 00:20:59.983 { 00:20:59.983 "name": "BaseBdev4", 00:20:59.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.983 "is_configured": false, 00:20:59.983 "data_offset": 0, 00:20:59.983 "data_size": 0 00:20:59.983 } 00:20:59.983 ] 00:20:59.983 }' 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.983 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.547 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:01.112 [2024-07-15 13:39:40.356941] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:21:01.112 [2024-07-15 13:39:40.356986] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb2310 name Existed_Raid, state configuring 00:21:01.112 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:01.370 [2024-07-15 13:39:40.605630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:01.370 [2024-07-15 13:39:40.607168] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:01.370 [2024-07-15 13:39:40.607201] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:01.370 [2024-07-15 13:39:40.607212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:01.370 [2024-07-15 13:39:40.607224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:01.370 [2024-07-15 13:39:40.607233] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:01.370 [2024-07-15 13:39:40.607244] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.370 13:39:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.370 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.627 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.627 "name": "Existed_Raid", 00:21:01.627 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:01.627 "strip_size_kb": 0, 00:21:01.627 "state": "configuring", 00:21:01.627 "raid_level": "raid1", 00:21:01.627 "superblock": true, 00:21:01.627 "num_base_bdevs": 4, 00:21:01.627 "num_base_bdevs_discovered": 1, 00:21:01.627 "num_base_bdevs_operational": 4, 00:21:01.627 "base_bdevs_list": [ 00:21:01.627 { 00:21:01.627 "name": "BaseBdev1", 00:21:01.627 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:01.627 "is_configured": true, 00:21:01.627 "data_offset": 2048, 00:21:01.627 "data_size": 63488 00:21:01.627 }, 00:21:01.627 { 00:21:01.627 "name": "BaseBdev2", 00:21:01.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.627 "is_configured": false, 
00:21:01.627 "data_offset": 0, 00:21:01.627 "data_size": 0 00:21:01.627 }, 00:21:01.627 { 00:21:01.627 "name": "BaseBdev3", 00:21:01.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.627 "is_configured": false, 00:21:01.627 "data_offset": 0, 00:21:01.627 "data_size": 0 00:21:01.627 }, 00:21:01.627 { 00:21:01.627 "name": "BaseBdev4", 00:21:01.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.627 "is_configured": false, 00:21:01.627 "data_offset": 0, 00:21:01.627 "data_size": 0 00:21:01.627 } 00:21:01.627 ] 00:21:01.627 }' 00:21:01.627 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.627 13:39:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:02.188 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:02.443 [2024-07-15 13:39:41.696138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:02.443 BaseBdev2 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:02.443 13:39:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:03.055 [ 00:21:03.055 { 00:21:03.055 "name": "BaseBdev2", 00:21:03.055 "aliases": [ 00:21:03.055 "f1dcb0e1-cd82-4003-ab24-d105b70290a4" 00:21:03.055 ], 00:21:03.055 "product_name": "Malloc disk", 00:21:03.055 "block_size": 512, 00:21:03.055 "num_blocks": 65536, 00:21:03.055 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:03.055 "assigned_rate_limits": { 00:21:03.055 "rw_ios_per_sec": 0, 00:21:03.055 "rw_mbytes_per_sec": 0, 00:21:03.055 "r_mbytes_per_sec": 0, 00:21:03.055 "w_mbytes_per_sec": 0 00:21:03.055 }, 00:21:03.055 "claimed": true, 00:21:03.055 "claim_type": "exclusive_write", 00:21:03.055 "zoned": false, 00:21:03.055 "supported_io_types": { 00:21:03.055 "read": true, 00:21:03.055 "write": true, 00:21:03.055 "unmap": true, 00:21:03.055 "flush": true, 00:21:03.055 "reset": true, 00:21:03.055 "nvme_admin": false, 00:21:03.055 "nvme_io": false, 00:21:03.055 "nvme_io_md": false, 00:21:03.055 "write_zeroes": true, 00:21:03.055 "zcopy": true, 00:21:03.055 "get_zone_info": false, 00:21:03.055 "zone_management": false, 00:21:03.055 "zone_append": false, 00:21:03.055 "compare": false, 00:21:03.055 "compare_and_write": false, 00:21:03.055 "abort": true, 00:21:03.055 "seek_hole": false, 00:21:03.055 "seek_data": false, 00:21:03.055 "copy": true, 00:21:03.055 "nvme_iov_md": false 00:21:03.055 }, 00:21:03.055 "memory_domains": [ 00:21:03.055 { 00:21:03.055 "dma_device_id": "system", 00:21:03.055 "dma_device_type": 1 00:21:03.055 }, 00:21:03.055 { 00:21:03.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.055 "dma_device_type": 2 00:21:03.055 } 00:21:03.055 ], 00:21:03.055 "driver_specific": {} 00:21:03.055 } 00:21:03.055 ] 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.055 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.311 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.311 "name": "Existed_Raid", 00:21:03.311 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:03.311 "strip_size_kb": 0, 
00:21:03.311 "state": "configuring", 00:21:03.311 "raid_level": "raid1", 00:21:03.311 "superblock": true, 00:21:03.311 "num_base_bdevs": 4, 00:21:03.311 "num_base_bdevs_discovered": 2, 00:21:03.311 "num_base_bdevs_operational": 4, 00:21:03.311 "base_bdevs_list": [ 00:21:03.311 { 00:21:03.311 "name": "BaseBdev1", 00:21:03.311 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:03.311 "is_configured": true, 00:21:03.311 "data_offset": 2048, 00:21:03.311 "data_size": 63488 00:21:03.311 }, 00:21:03.311 { 00:21:03.311 "name": "BaseBdev2", 00:21:03.311 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:03.311 "is_configured": true, 00:21:03.311 "data_offset": 2048, 00:21:03.311 "data_size": 63488 00:21:03.311 }, 00:21:03.311 { 00:21:03.311 "name": "BaseBdev3", 00:21:03.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.312 "is_configured": false, 00:21:03.312 "data_offset": 0, 00:21:03.312 "data_size": 0 00:21:03.312 }, 00:21:03.312 { 00:21:03.312 "name": "BaseBdev4", 00:21:03.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.312 "is_configured": false, 00:21:03.312 "data_offset": 0, 00:21:03.312 "data_size": 0 00:21:03.312 } 00:21:03.312 ] 00:21:03.312 }' 00:21:03.312 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.312 13:39:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:03.880 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:04.140 [2024-07-15 13:39:43.408037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:04.140 BaseBdev3 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:04.140 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.704 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:04.704 [ 00:21:04.704 { 00:21:04.704 "name": "BaseBdev3", 00:21:04.704 "aliases": [ 00:21:04.704 "ef5197c8-b271-485b-87a5-be5b295f71c9" 00:21:04.704 ], 00:21:04.704 "product_name": "Malloc disk", 00:21:04.704 "block_size": 512, 00:21:04.704 "num_blocks": 65536, 00:21:04.704 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:04.704 "assigned_rate_limits": { 00:21:04.704 "rw_ios_per_sec": 0, 00:21:04.704 "rw_mbytes_per_sec": 0, 00:21:04.704 "r_mbytes_per_sec": 0, 00:21:04.704 "w_mbytes_per_sec": 0 00:21:04.704 }, 00:21:04.704 "claimed": true, 00:21:04.704 "claim_type": "exclusive_write", 00:21:04.704 "zoned": false, 00:21:04.704 "supported_io_types": { 00:21:04.704 "read": true, 00:21:04.704 "write": true, 00:21:04.704 "unmap": true, 00:21:04.704 "flush": true, 00:21:04.704 "reset": true, 00:21:04.704 "nvme_admin": false, 00:21:04.704 "nvme_io": false, 00:21:04.704 "nvme_io_md": false, 00:21:04.704 "write_zeroes": true, 00:21:04.704 "zcopy": true, 00:21:04.704 "get_zone_info": false, 00:21:04.704 "zone_management": false, 00:21:04.704 "zone_append": false, 00:21:04.704 
"compare": false, 00:21:04.704 "compare_and_write": false, 00:21:04.704 "abort": true, 00:21:04.704 "seek_hole": false, 00:21:04.704 "seek_data": false, 00:21:04.704 "copy": true, 00:21:04.704 "nvme_iov_md": false 00:21:04.704 }, 00:21:04.704 "memory_domains": [ 00:21:04.704 { 00:21:04.704 "dma_device_id": "system", 00:21:04.704 "dma_device_type": 1 00:21:04.704 }, 00:21:04.704 { 00:21:04.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.704 "dma_device_type": 2 00:21:04.704 } 00:21:04.704 ], 00:21:04.704 "driver_specific": {} 00:21:04.704 } 00:21:04.704 ] 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.704 13:39:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.704 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.960 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.960 "name": "Existed_Raid", 00:21:04.960 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:04.960 "strip_size_kb": 0, 00:21:04.960 "state": "configuring", 00:21:04.960 "raid_level": "raid1", 00:21:04.960 "superblock": true, 00:21:04.960 "num_base_bdevs": 4, 00:21:04.960 "num_base_bdevs_discovered": 3, 00:21:04.960 "num_base_bdevs_operational": 4, 00:21:04.960 "base_bdevs_list": [ 00:21:04.960 { 00:21:04.960 "name": "BaseBdev1", 00:21:04.960 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:04.960 "is_configured": true, 00:21:04.960 "data_offset": 2048, 00:21:04.960 "data_size": 63488 00:21:04.960 }, 00:21:04.960 { 00:21:04.960 "name": "BaseBdev2", 00:21:04.960 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:04.960 "is_configured": true, 00:21:04.960 "data_offset": 2048, 00:21:04.960 "data_size": 63488 00:21:04.960 }, 00:21:04.960 { 00:21:04.960 "name": "BaseBdev3", 00:21:04.960 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:04.960 "is_configured": true, 00:21:04.960 "data_offset": 2048, 00:21:04.960 "data_size": 63488 00:21:04.960 }, 00:21:04.960 { 00:21:04.960 "name": "BaseBdev4", 00:21:04.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.960 "is_configured": false, 00:21:04.960 "data_offset": 0, 00:21:04.960 "data_size": 0 00:21:04.960 } 00:21:04.960 ] 00:21:04.960 }' 00:21:04.960 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.960 13:39:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.523 13:39:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:05.780 [2024-07-15 13:39:45.095879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:05.780 [2024-07-15 13:39:45.096062] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbb3350 00:21:05.780 [2024-07-15 13:39:45.096093] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:05.780 [2024-07-15 13:39:45.096267] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb3020 00:21:05.780 [2024-07-15 13:39:45.096389] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbb3350 00:21:05.780 [2024-07-15 13:39:45.096400] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbb3350 00:21:05.780 [2024-07-15 13:39:45.096494] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:05.780 BaseBdev4 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:05.780 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.037 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:06.602 [ 00:21:06.602 { 00:21:06.602 "name": "BaseBdev4", 00:21:06.602 "aliases": [ 00:21:06.602 "a450c394-6e1e-4b53-a40f-6cec40351bba" 00:21:06.602 ], 00:21:06.602 "product_name": "Malloc disk", 00:21:06.602 "block_size": 512, 00:21:06.602 "num_blocks": 65536, 00:21:06.602 "uuid": "a450c394-6e1e-4b53-a40f-6cec40351bba", 00:21:06.602 "assigned_rate_limits": { 00:21:06.602 "rw_ios_per_sec": 0, 00:21:06.602 "rw_mbytes_per_sec": 0, 00:21:06.602 "r_mbytes_per_sec": 0, 00:21:06.602 "w_mbytes_per_sec": 0 00:21:06.602 }, 00:21:06.602 "claimed": true, 00:21:06.602 "claim_type": "exclusive_write", 00:21:06.602 "zoned": false, 00:21:06.602 "supported_io_types": { 00:21:06.602 "read": true, 00:21:06.602 "write": true, 00:21:06.602 "unmap": true, 00:21:06.602 "flush": true, 00:21:06.602 "reset": true, 00:21:06.602 "nvme_admin": false, 00:21:06.602 "nvme_io": false, 00:21:06.602 "nvme_io_md": false, 00:21:06.602 "write_zeroes": true, 00:21:06.602 "zcopy": true, 00:21:06.602 "get_zone_info": false, 00:21:06.602 "zone_management": false, 00:21:06.602 "zone_append": false, 00:21:06.602 "compare": false, 00:21:06.602 "compare_and_write": false, 00:21:06.602 "abort": true, 00:21:06.602 "seek_hole": false, 00:21:06.602 "seek_data": false, 00:21:06.602 "copy": true, 00:21:06.602 "nvme_iov_md": false 00:21:06.602 }, 00:21:06.602 "memory_domains": [ 00:21:06.602 { 00:21:06.602 "dma_device_id": "system", 00:21:06.602 "dma_device_type": 1 00:21:06.602 }, 00:21:06.602 { 00:21:06.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.602 "dma_device_type": 2 00:21:06.602 } 00:21:06.602 ], 00:21:06.602 "driver_specific": {} 00:21:06.602 } 00:21:06.602 ] 
00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.602 13:39:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.602 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.602 "name": "Existed_Raid", 00:21:06.602 
"uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:06.602 "strip_size_kb": 0, 00:21:06.602 "state": "online", 00:21:06.602 "raid_level": "raid1", 00:21:06.602 "superblock": true, 00:21:06.602 "num_base_bdevs": 4, 00:21:06.602 "num_base_bdevs_discovered": 4, 00:21:06.603 "num_base_bdevs_operational": 4, 00:21:06.603 "base_bdevs_list": [ 00:21:06.603 { 00:21:06.603 "name": "BaseBdev1", 00:21:06.603 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:06.603 "is_configured": true, 00:21:06.603 "data_offset": 2048, 00:21:06.603 "data_size": 63488 00:21:06.603 }, 00:21:06.603 { 00:21:06.603 "name": "BaseBdev2", 00:21:06.603 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:06.603 "is_configured": true, 00:21:06.603 "data_offset": 2048, 00:21:06.603 "data_size": 63488 00:21:06.603 }, 00:21:06.603 { 00:21:06.603 "name": "BaseBdev3", 00:21:06.603 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:06.603 "is_configured": true, 00:21:06.603 "data_offset": 2048, 00:21:06.603 "data_size": 63488 00:21:06.603 }, 00:21:06.603 { 00:21:06.603 "name": "BaseBdev4", 00:21:06.603 "uuid": "a450c394-6e1e-4b53-a40f-6cec40351bba", 00:21:06.603 "is_configured": true, 00:21:06.603 "data_offset": 2048, 00:21:06.603 "data_size": 63488 00:21:06.603 } 00:21:06.603 ] 00:21:06.603 }' 00:21:06.603 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.603 13:39:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:07.169 13:39:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:07.169 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:07.446 [2024-07-15 13:39:46.752613] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:07.446 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:07.446 "name": "Existed_Raid", 00:21:07.446 "aliases": [ 00:21:07.446 "7c94df06-135a-478c-a386-03e0d45fb915" 00:21:07.446 ], 00:21:07.446 "product_name": "Raid Volume", 00:21:07.446 "block_size": 512, 00:21:07.446 "num_blocks": 63488, 00:21:07.446 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:07.446 "assigned_rate_limits": { 00:21:07.446 "rw_ios_per_sec": 0, 00:21:07.446 "rw_mbytes_per_sec": 0, 00:21:07.446 "r_mbytes_per_sec": 0, 00:21:07.446 "w_mbytes_per_sec": 0 00:21:07.446 }, 00:21:07.446 "claimed": false, 00:21:07.446 "zoned": false, 00:21:07.446 "supported_io_types": { 00:21:07.446 "read": true, 00:21:07.446 "write": true, 00:21:07.446 "unmap": false, 00:21:07.446 "flush": false, 00:21:07.446 "reset": true, 00:21:07.446 "nvme_admin": false, 00:21:07.446 "nvme_io": false, 00:21:07.446 "nvme_io_md": false, 00:21:07.446 "write_zeroes": true, 00:21:07.446 "zcopy": false, 00:21:07.446 "get_zone_info": false, 00:21:07.446 "zone_management": false, 00:21:07.446 "zone_append": false, 00:21:07.446 "compare": false, 00:21:07.446 "compare_and_write": false, 00:21:07.446 "abort": false, 00:21:07.446 "seek_hole": false, 00:21:07.446 "seek_data": false, 00:21:07.446 "copy": false, 00:21:07.446 "nvme_iov_md": false 00:21:07.446 }, 00:21:07.446 
"memory_domains": [ 00:21:07.446 { 00:21:07.446 "dma_device_id": "system", 00:21:07.446 "dma_device_type": 1 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.446 "dma_device_type": 2 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "system", 00:21:07.446 "dma_device_type": 1 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.446 "dma_device_type": 2 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "system", 00:21:07.446 "dma_device_type": 1 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.446 "dma_device_type": 2 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "system", 00:21:07.446 "dma_device_type": 1 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.446 "dma_device_type": 2 00:21:07.446 } 00:21:07.446 ], 00:21:07.446 "driver_specific": { 00:21:07.446 "raid": { 00:21:07.446 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:07.446 "strip_size_kb": 0, 00:21:07.446 "state": "online", 00:21:07.446 "raid_level": "raid1", 00:21:07.446 "superblock": true, 00:21:07.446 "num_base_bdevs": 4, 00:21:07.446 "num_base_bdevs_discovered": 4, 00:21:07.446 "num_base_bdevs_operational": 4, 00:21:07.446 "base_bdevs_list": [ 00:21:07.446 { 00:21:07.446 "name": "BaseBdev1", 00:21:07.446 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:07.446 "is_configured": true, 00:21:07.446 "data_offset": 2048, 00:21:07.446 "data_size": 63488 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "name": "BaseBdev2", 00:21:07.446 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:07.446 "is_configured": true, 00:21:07.446 "data_offset": 2048, 00:21:07.446 "data_size": 63488 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "name": "BaseBdev3", 00:21:07.446 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:07.446 "is_configured": true, 00:21:07.446 "data_offset": 2048, 00:21:07.446 
"data_size": 63488 00:21:07.446 }, 00:21:07.446 { 00:21:07.446 "name": "BaseBdev4", 00:21:07.446 "uuid": "a450c394-6e1e-4b53-a40f-6cec40351bba", 00:21:07.446 "is_configured": true, 00:21:07.446 "data_offset": 2048, 00:21:07.446 "data_size": 63488 00:21:07.446 } 00:21:07.446 ] 00:21:07.446 } 00:21:07.446 } 00:21:07.446 }' 00:21:07.446 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:07.446 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:07.446 BaseBdev2 00:21:07.446 BaseBdev3 00:21:07.446 BaseBdev4' 00:21:07.447 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.447 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:07.447 13:39:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.705 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.705 "name": "BaseBdev1", 00:21:07.705 "aliases": [ 00:21:07.705 "da88a6d3-e281-4795-b85d-b2c2fdd7f940" 00:21:07.705 ], 00:21:07.705 "product_name": "Malloc disk", 00:21:07.705 "block_size": 512, 00:21:07.705 "num_blocks": 65536, 00:21:07.705 "uuid": "da88a6d3-e281-4795-b85d-b2c2fdd7f940", 00:21:07.705 "assigned_rate_limits": { 00:21:07.705 "rw_ios_per_sec": 0, 00:21:07.705 "rw_mbytes_per_sec": 0, 00:21:07.705 "r_mbytes_per_sec": 0, 00:21:07.705 "w_mbytes_per_sec": 0 00:21:07.705 }, 00:21:07.705 "claimed": true, 00:21:07.705 "claim_type": "exclusive_write", 00:21:07.705 "zoned": false, 00:21:07.705 "supported_io_types": { 00:21:07.705 "read": true, 00:21:07.705 "write": true, 00:21:07.705 "unmap": true, 00:21:07.705 "flush": true, 00:21:07.705 "reset": true, 
00:21:07.705 "nvme_admin": false, 00:21:07.705 "nvme_io": false, 00:21:07.705 "nvme_io_md": false, 00:21:07.705 "write_zeroes": true, 00:21:07.705 "zcopy": true, 00:21:07.705 "get_zone_info": false, 00:21:07.705 "zone_management": false, 00:21:07.705 "zone_append": false, 00:21:07.705 "compare": false, 00:21:07.705 "compare_and_write": false, 00:21:07.705 "abort": true, 00:21:07.705 "seek_hole": false, 00:21:07.705 "seek_data": false, 00:21:07.705 "copy": true, 00:21:07.705 "nvme_iov_md": false 00:21:07.705 }, 00:21:07.705 "memory_domains": [ 00:21:07.705 { 00:21:07.705 "dma_device_id": "system", 00:21:07.705 "dma_device_type": 1 00:21:07.705 }, 00:21:07.705 { 00:21:07.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.705 "dma_device_type": 2 00:21:07.705 } 00:21:07.705 ], 00:21:07.705 "driver_specific": {} 00:21:07.705 }' 00:21:07.705 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.963 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.220 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:08.220 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.220 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.220 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:08.220 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.478 "name": "BaseBdev2", 00:21:08.478 "aliases": [ 00:21:08.478 "f1dcb0e1-cd82-4003-ab24-d105b70290a4" 00:21:08.478 ], 00:21:08.478 "product_name": "Malloc disk", 00:21:08.478 "block_size": 512, 00:21:08.478 "num_blocks": 65536, 00:21:08.478 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:08.478 "assigned_rate_limits": { 00:21:08.478 "rw_ios_per_sec": 0, 00:21:08.478 "rw_mbytes_per_sec": 0, 00:21:08.478 "r_mbytes_per_sec": 0, 00:21:08.478 "w_mbytes_per_sec": 0 00:21:08.478 }, 00:21:08.478 "claimed": true, 00:21:08.478 "claim_type": "exclusive_write", 00:21:08.478 "zoned": false, 00:21:08.478 "supported_io_types": { 00:21:08.478 "read": true, 00:21:08.478 "write": true, 00:21:08.478 "unmap": true, 00:21:08.478 "flush": true, 00:21:08.478 "reset": true, 00:21:08.478 "nvme_admin": false, 00:21:08.478 "nvme_io": false, 00:21:08.478 "nvme_io_md": false, 00:21:08.478 "write_zeroes": true, 00:21:08.478 "zcopy": true, 00:21:08.478 "get_zone_info": false, 00:21:08.478 "zone_management": false, 00:21:08.478 "zone_append": false, 00:21:08.478 "compare": false, 00:21:08.478 "compare_and_write": false, 00:21:08.478 "abort": true, 00:21:08.478 "seek_hole": false, 00:21:08.478 "seek_data": false, 00:21:08.478 "copy": true, 00:21:08.478 "nvme_iov_md": false 00:21:08.478 }, 00:21:08.478 "memory_domains": [ 00:21:08.478 { 
00:21:08.478 "dma_device_id": "system", 00:21:08.478 "dma_device_type": 1 00:21:08.478 }, 00:21:08.478 { 00:21:08.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.478 "dma_device_type": 2 00:21:08.478 } 00:21:08.478 ], 00:21:08.478 "driver_specific": {} 00:21:08.478 }' 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.478 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.737 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.737 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:08.737 13:39:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.737 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.737 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.737 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.737 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:08.737 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.995 13:39:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.995 "name": "BaseBdev3", 00:21:08.995 "aliases": [ 00:21:08.995 "ef5197c8-b271-485b-87a5-be5b295f71c9" 00:21:08.995 ], 00:21:08.995 "product_name": "Malloc disk", 00:21:08.995 "block_size": 512, 00:21:08.995 "num_blocks": 65536, 00:21:08.995 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:08.995 "assigned_rate_limits": { 00:21:08.995 "rw_ios_per_sec": 0, 00:21:08.995 "rw_mbytes_per_sec": 0, 00:21:08.995 "r_mbytes_per_sec": 0, 00:21:08.995 "w_mbytes_per_sec": 0 00:21:08.995 }, 00:21:08.995 "claimed": true, 00:21:08.995 "claim_type": "exclusive_write", 00:21:08.995 "zoned": false, 00:21:08.995 "supported_io_types": { 00:21:08.995 "read": true, 00:21:08.995 "write": true, 00:21:08.995 "unmap": true, 00:21:08.995 "flush": true, 00:21:08.995 "reset": true, 00:21:08.995 "nvme_admin": false, 00:21:08.995 "nvme_io": false, 00:21:08.995 "nvme_io_md": false, 00:21:08.995 "write_zeroes": true, 00:21:08.995 "zcopy": true, 00:21:08.995 "get_zone_info": false, 00:21:08.995 "zone_management": false, 00:21:08.995 "zone_append": false, 00:21:08.995 "compare": false, 00:21:08.995 "compare_and_write": false, 00:21:08.995 "abort": true, 00:21:08.995 "seek_hole": false, 00:21:08.995 "seek_data": false, 00:21:08.995 "copy": true, 00:21:08.995 "nvme_iov_md": false 00:21:08.995 }, 00:21:08.995 "memory_domains": [ 00:21:08.995 { 00:21:08.995 "dma_device_id": "system", 00:21:08.995 "dma_device_type": 1 00:21:08.995 }, 00:21:08.995 { 00:21:08.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.995 "dma_device_type": 2 00:21:08.995 } 00:21:08.995 ], 00:21:08.995 "driver_specific": {} 00:21:08.995 }' 00:21:08.995 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.995 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.995 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:21:08.995 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:09.253 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.509 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.509 "name": "BaseBdev4", 00:21:09.509 "aliases": [ 00:21:09.509 "a450c394-6e1e-4b53-a40f-6cec40351bba" 00:21:09.509 ], 00:21:09.509 "product_name": "Malloc disk", 00:21:09.509 "block_size": 512, 00:21:09.509 "num_blocks": 65536, 00:21:09.509 "uuid": "a450c394-6e1e-4b53-a40f-6cec40351bba", 00:21:09.509 "assigned_rate_limits": { 00:21:09.509 "rw_ios_per_sec": 0, 00:21:09.509 "rw_mbytes_per_sec": 0, 00:21:09.509 "r_mbytes_per_sec": 0, 00:21:09.509 "w_mbytes_per_sec": 0 
00:21:09.509 }, 00:21:09.509 "claimed": true, 00:21:09.509 "claim_type": "exclusive_write", 00:21:09.509 "zoned": false, 00:21:09.509 "supported_io_types": { 00:21:09.509 "read": true, 00:21:09.509 "write": true, 00:21:09.509 "unmap": true, 00:21:09.509 "flush": true, 00:21:09.509 "reset": true, 00:21:09.509 "nvme_admin": false, 00:21:09.509 "nvme_io": false, 00:21:09.509 "nvme_io_md": false, 00:21:09.509 "write_zeroes": true, 00:21:09.509 "zcopy": true, 00:21:09.509 "get_zone_info": false, 00:21:09.509 "zone_management": false, 00:21:09.509 "zone_append": false, 00:21:09.509 "compare": false, 00:21:09.509 "compare_and_write": false, 00:21:09.509 "abort": true, 00:21:09.509 "seek_hole": false, 00:21:09.509 "seek_data": false, 00:21:09.509 "copy": true, 00:21:09.509 "nvme_iov_md": false 00:21:09.509 }, 00:21:09.509 "memory_domains": [ 00:21:09.509 { 00:21:09.509 "dma_device_id": "system", 00:21:09.509 "dma_device_type": 1 00:21:09.509 }, 00:21:09.509 { 00:21:09.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.509 "dma_device_type": 2 00:21:09.509 } 00:21:09.509 ], 00:21:09.509 "driver_specific": {} 00:21:09.509 }' 00:21:09.509 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.801 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.801 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.801 13:39:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.801 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.801 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.801 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.801 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.801 
13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.801 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.802 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:10.059 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:10.059 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:10.317 [2024-07-15 13:39:49.487585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.317 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.576 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.576 "name": "Existed_Raid", 00:21:10.576 "uuid": "7c94df06-135a-478c-a386-03e0d45fb915", 00:21:10.576 "strip_size_kb": 0, 00:21:10.576 "state": "online", 00:21:10.576 "raid_level": "raid1", 00:21:10.576 "superblock": true, 00:21:10.576 "num_base_bdevs": 4, 00:21:10.576 "num_base_bdevs_discovered": 3, 00:21:10.576 "num_base_bdevs_operational": 3, 00:21:10.576 "base_bdevs_list": [ 00:21:10.576 { 00:21:10.576 "name": null, 00:21:10.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.576 "is_configured": false, 00:21:10.576 "data_offset": 2048, 00:21:10.576 "data_size": 63488 00:21:10.576 }, 00:21:10.576 { 00:21:10.576 "name": "BaseBdev2", 00:21:10.576 "uuid": "f1dcb0e1-cd82-4003-ab24-d105b70290a4", 00:21:10.576 "is_configured": true, 00:21:10.576 "data_offset": 2048, 00:21:10.576 "data_size": 63488 00:21:10.576 }, 00:21:10.576 { 00:21:10.576 "name": "BaseBdev3", 00:21:10.576 "uuid": "ef5197c8-b271-485b-87a5-be5b295f71c9", 00:21:10.576 "is_configured": true, 00:21:10.576 "data_offset": 2048, 00:21:10.576 "data_size": 63488 00:21:10.576 }, 00:21:10.576 { 00:21:10.576 "name": 
"BaseBdev4", 00:21:10.576 "uuid": "a450c394-6e1e-4b53-a40f-6cec40351bba", 00:21:10.576 "is_configured": true, 00:21:10.576 "data_offset": 2048, 00:21:10.576 "data_size": 63488 00:21:10.576 } 00:21:10.576 ] 00:21:10.576 }' 00:21:10.576 13:39:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.576 13:39:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:11.139 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:11.140 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:11.140 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.140 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:11.396 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:11.396 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:11.396 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:11.654 [2024-07-15 13:39:50.828438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:11.654 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:11.654 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:11.654 13:39:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.654 13:39:50 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:11.911 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:11.911 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:11.911 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:11.911 [2024-07-15 13:39:51.316348] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:12.169 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:12.169 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:12.169 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.169 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:12.426 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:12.426 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:12.426 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:12.426 [2024-07-15 13:39:51.820096] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:12.426 [2024-07-15 13:39:51.820174] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:12.426 [2024-07-15 13:39:51.830998] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:12.426 [2024-07-15 13:39:51.831035] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:12.426 [2024-07-15 13:39:51.831047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb3350 name Existed_Raid, state offline 00:21:12.683 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:12.683 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:12.683 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.683 13:39:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:12.683 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:12.941 BaseBdev2 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:12.941 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:13.198 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:13.455 [ 00:21:13.455 { 00:21:13.455 "name": "BaseBdev2", 00:21:13.455 "aliases": [ 00:21:13.455 "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62" 00:21:13.455 ], 00:21:13.455 "product_name": "Malloc disk", 00:21:13.455 "block_size": 512, 00:21:13.455 "num_blocks": 65536, 00:21:13.455 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:13.455 "assigned_rate_limits": { 00:21:13.455 "rw_ios_per_sec": 0, 00:21:13.455 "rw_mbytes_per_sec": 0, 00:21:13.455 "r_mbytes_per_sec": 0, 00:21:13.455 "w_mbytes_per_sec": 0 00:21:13.455 }, 00:21:13.455 "claimed": false, 00:21:13.455 "zoned": false, 00:21:13.455 "supported_io_types": { 00:21:13.455 "read": true, 00:21:13.455 "write": true, 00:21:13.455 "unmap": true, 00:21:13.455 "flush": true, 00:21:13.455 "reset": true, 00:21:13.455 "nvme_admin": false, 00:21:13.455 "nvme_io": false, 00:21:13.455 "nvme_io_md": false, 00:21:13.455 "write_zeroes": true, 00:21:13.455 "zcopy": true, 00:21:13.455 "get_zone_info": false, 00:21:13.455 "zone_management": false, 00:21:13.455 "zone_append": false, 00:21:13.455 "compare": false, 00:21:13.455 "compare_and_write": false, 00:21:13.455 "abort": true, 00:21:13.455 "seek_hole": false, 00:21:13.455 "seek_data": false, 00:21:13.455 "copy": true, 00:21:13.455 "nvme_iov_md": false 00:21:13.455 }, 00:21:13.455 
"memory_domains": [ 00:21:13.455 { 00:21:13.455 "dma_device_id": "system", 00:21:13.455 "dma_device_type": 1 00:21:13.455 }, 00:21:13.455 { 00:21:13.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.455 "dma_device_type": 2 00:21:13.455 } 00:21:13.455 ], 00:21:13.455 "driver_specific": {} 00:21:13.455 } 00:21:13.455 ] 00:21:13.455 13:39:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:13.455 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:13.455 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:13.455 13:39:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:13.712 BaseBdev3 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:13.712 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:13.970 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:21:14.228 [ 00:21:14.228 { 00:21:14.228 "name": "BaseBdev3", 00:21:14.228 "aliases": [ 00:21:14.228 "7dad5a71-c7ea-4593-81b4-63f1b70b2403" 00:21:14.228 ], 00:21:14.228 "product_name": "Malloc disk", 00:21:14.228 "block_size": 512, 00:21:14.228 "num_blocks": 65536, 00:21:14.228 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:14.228 "assigned_rate_limits": { 00:21:14.228 "rw_ios_per_sec": 0, 00:21:14.228 "rw_mbytes_per_sec": 0, 00:21:14.228 "r_mbytes_per_sec": 0, 00:21:14.228 "w_mbytes_per_sec": 0 00:21:14.228 }, 00:21:14.228 "claimed": false, 00:21:14.228 "zoned": false, 00:21:14.228 "supported_io_types": { 00:21:14.228 "read": true, 00:21:14.228 "write": true, 00:21:14.228 "unmap": true, 00:21:14.228 "flush": true, 00:21:14.228 "reset": true, 00:21:14.228 "nvme_admin": false, 00:21:14.228 "nvme_io": false, 00:21:14.228 "nvme_io_md": false, 00:21:14.228 "write_zeroes": true, 00:21:14.228 "zcopy": true, 00:21:14.228 "get_zone_info": false, 00:21:14.228 "zone_management": false, 00:21:14.228 "zone_append": false, 00:21:14.228 "compare": false, 00:21:14.228 "compare_and_write": false, 00:21:14.228 "abort": true, 00:21:14.228 "seek_hole": false, 00:21:14.228 "seek_data": false, 00:21:14.228 "copy": true, 00:21:14.228 "nvme_iov_md": false 00:21:14.228 }, 00:21:14.228 "memory_domains": [ 00:21:14.228 { 00:21:14.228 "dma_device_id": "system", 00:21:14.228 "dma_device_type": 1 00:21:14.228 }, 00:21:14.228 { 00:21:14.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.228 "dma_device_type": 2 00:21:14.228 } 00:21:14.228 ], 00:21:14.228 "driver_specific": {} 00:21:14.228 } 00:21:14.228 ] 00:21:14.228 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:14.228 13:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:14.228 13:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:14.228 13:39:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:14.496 BaseBdev4 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:14.496 13:39:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:14.753 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:15.010 [ 00:21:15.010 { 00:21:15.010 "name": "BaseBdev4", 00:21:15.010 "aliases": [ 00:21:15.010 "d03daf1e-6102-4a34-a0f7-31c113db2aa2" 00:21:15.010 ], 00:21:15.010 "product_name": "Malloc disk", 00:21:15.010 "block_size": 512, 00:21:15.010 "num_blocks": 65536, 00:21:15.010 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:15.010 "assigned_rate_limits": { 00:21:15.010 "rw_ios_per_sec": 0, 00:21:15.010 "rw_mbytes_per_sec": 0, 00:21:15.010 "r_mbytes_per_sec": 0, 00:21:15.010 "w_mbytes_per_sec": 0 00:21:15.010 }, 00:21:15.010 "claimed": false, 00:21:15.010 "zoned": false, 00:21:15.010 "supported_io_types": { 00:21:15.010 "read": true, 
00:21:15.010 "write": true, 00:21:15.010 "unmap": true, 00:21:15.010 "flush": true, 00:21:15.010 "reset": true, 00:21:15.010 "nvme_admin": false, 00:21:15.010 "nvme_io": false, 00:21:15.010 "nvme_io_md": false, 00:21:15.010 "write_zeroes": true, 00:21:15.010 "zcopy": true, 00:21:15.010 "get_zone_info": false, 00:21:15.010 "zone_management": false, 00:21:15.010 "zone_append": false, 00:21:15.010 "compare": false, 00:21:15.010 "compare_and_write": false, 00:21:15.010 "abort": true, 00:21:15.010 "seek_hole": false, 00:21:15.010 "seek_data": false, 00:21:15.010 "copy": true, 00:21:15.010 "nvme_iov_md": false 00:21:15.010 }, 00:21:15.010 "memory_domains": [ 00:21:15.010 { 00:21:15.010 "dma_device_id": "system", 00:21:15.010 "dma_device_type": 1 00:21:15.010 }, 00:21:15.010 { 00:21:15.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.010 "dma_device_type": 2 00:21:15.010 } 00:21:15.010 ], 00:21:15.010 "driver_specific": {} 00:21:15.010 } 00:21:15.010 ] 00:21:15.010 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:15.010 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:15.010 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:15.010 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:15.267 [2024-07-15 13:39:54.524487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:15.267 [2024-07-15 13:39:54.524528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:15.267 [2024-07-15 13:39:54.524547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:15.267 [2024-07-15 13:39:54.525957] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:15.267 [2024-07-15 13:39:54.525999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.267 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.524 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.525 "name": "Existed_Raid", 00:21:15.525 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:15.525 "strip_size_kb": 0, 00:21:15.525 "state": 
"configuring", 00:21:15.525 "raid_level": "raid1", 00:21:15.525 "superblock": true, 00:21:15.525 "num_base_bdevs": 4, 00:21:15.525 "num_base_bdevs_discovered": 3, 00:21:15.525 "num_base_bdevs_operational": 4, 00:21:15.525 "base_bdevs_list": [ 00:21:15.525 { 00:21:15.525 "name": "BaseBdev1", 00:21:15.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.525 "is_configured": false, 00:21:15.525 "data_offset": 0, 00:21:15.525 "data_size": 0 00:21:15.525 }, 00:21:15.525 { 00:21:15.525 "name": "BaseBdev2", 00:21:15.525 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:15.525 "is_configured": true, 00:21:15.525 "data_offset": 2048, 00:21:15.525 "data_size": 63488 00:21:15.525 }, 00:21:15.525 { 00:21:15.525 "name": "BaseBdev3", 00:21:15.525 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:15.525 "is_configured": true, 00:21:15.525 "data_offset": 2048, 00:21:15.525 "data_size": 63488 00:21:15.525 }, 00:21:15.525 { 00:21:15.525 "name": "BaseBdev4", 00:21:15.525 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:15.525 "is_configured": true, 00:21:15.525 "data_offset": 2048, 00:21:15.525 "data_size": 63488 00:21:15.525 } 00:21:15.525 ] 00:21:15.525 }' 00:21:15.525 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.525 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.086 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:16.343 [2024-07-15 13:39:55.595302] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.343 
13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.343 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.601 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.601 "name": "Existed_Raid", 00:21:16.601 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:16.601 "strip_size_kb": 0, 00:21:16.601 "state": "configuring", 00:21:16.601 "raid_level": "raid1", 00:21:16.601 "superblock": true, 00:21:16.601 "num_base_bdevs": 4, 00:21:16.601 "num_base_bdevs_discovered": 2, 00:21:16.601 "num_base_bdevs_operational": 4, 00:21:16.601 "base_bdevs_list": [ 00:21:16.601 { 00:21:16.601 "name": "BaseBdev1", 00:21:16.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.601 "is_configured": false, 00:21:16.601 "data_offset": 0, 00:21:16.601 "data_size": 0 00:21:16.601 }, 00:21:16.601 { 00:21:16.601 
"name": null, 00:21:16.601 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:16.601 "is_configured": false, 00:21:16.601 "data_offset": 2048, 00:21:16.601 "data_size": 63488 00:21:16.601 }, 00:21:16.601 { 00:21:16.601 "name": "BaseBdev3", 00:21:16.601 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:16.601 "is_configured": true, 00:21:16.601 "data_offset": 2048, 00:21:16.601 "data_size": 63488 00:21:16.601 }, 00:21:16.601 { 00:21:16.601 "name": "BaseBdev4", 00:21:16.601 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:16.601 "is_configured": true, 00:21:16.601 "data_offset": 2048, 00:21:16.601 "data_size": 63488 00:21:16.601 } 00:21:16.601 ] 00:21:16.601 }' 00:21:16.601 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.601 13:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:17.165 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:17.165 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:17.422 [2024-07-15 13:39:56.769780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:17.422 BaseBdev1 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:17.422 13:39:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:17.422 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:17.713 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:17.993 [ 00:21:17.993 { 00:21:17.993 "name": "BaseBdev1", 00:21:17.993 "aliases": [ 00:21:17.993 "68cf3c0e-8746-4907-9459-e7b023649b5b" 00:21:17.993 ], 00:21:17.993 "product_name": "Malloc disk", 00:21:17.993 "block_size": 512, 00:21:17.993 "num_blocks": 65536, 00:21:17.993 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:17.993 "assigned_rate_limits": { 00:21:17.993 "rw_ios_per_sec": 0, 00:21:17.993 "rw_mbytes_per_sec": 0, 00:21:17.993 "r_mbytes_per_sec": 0, 00:21:17.993 "w_mbytes_per_sec": 0 00:21:17.993 }, 00:21:17.993 "claimed": true, 00:21:17.993 "claim_type": "exclusive_write", 00:21:17.993 "zoned": false, 00:21:17.993 "supported_io_types": { 00:21:17.993 "read": true, 00:21:17.993 "write": true, 00:21:17.993 "unmap": true, 00:21:17.993 "flush": true, 00:21:17.993 "reset": true, 00:21:17.993 "nvme_admin": false, 00:21:17.994 "nvme_io": false, 00:21:17.994 "nvme_io_md": false, 00:21:17.994 "write_zeroes": true, 00:21:17.994 "zcopy": true, 00:21:17.994 "get_zone_info": false, 00:21:17.994 "zone_management": false, 00:21:17.994 "zone_append": false, 00:21:17.994 "compare": false, 00:21:17.994 
"compare_and_write": false, 00:21:17.994 "abort": true, 00:21:17.994 "seek_hole": false, 00:21:17.994 "seek_data": false, 00:21:17.994 "copy": true, 00:21:17.994 "nvme_iov_md": false 00:21:17.994 }, 00:21:17.994 "memory_domains": [ 00:21:17.994 { 00:21:17.994 "dma_device_id": "system", 00:21:17.994 "dma_device_type": 1 00:21:17.994 }, 00:21:17.994 { 00:21:17.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.994 "dma_device_type": 2 00:21:17.994 } 00:21:17.994 ], 00:21:17.994 "driver_specific": {} 00:21:17.994 } 00:21:17.994 ] 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.994 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:18.252 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.252 "name": "Existed_Raid", 00:21:18.252 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:18.252 "strip_size_kb": 0, 00:21:18.252 "state": "configuring", 00:21:18.252 "raid_level": "raid1", 00:21:18.252 "superblock": true, 00:21:18.252 "num_base_bdevs": 4, 00:21:18.252 "num_base_bdevs_discovered": 3, 00:21:18.252 "num_base_bdevs_operational": 4, 00:21:18.252 "base_bdevs_list": [ 00:21:18.252 { 00:21:18.252 "name": "BaseBdev1", 00:21:18.252 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:18.252 "is_configured": true, 00:21:18.252 "data_offset": 2048, 00:21:18.252 "data_size": 63488 00:21:18.252 }, 00:21:18.252 { 00:21:18.252 "name": null, 00:21:18.252 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:18.252 "is_configured": false, 00:21:18.252 "data_offset": 2048, 00:21:18.252 "data_size": 63488 00:21:18.252 }, 00:21:18.252 { 00:21:18.252 "name": "BaseBdev3", 00:21:18.252 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:18.252 "is_configured": true, 00:21:18.252 "data_offset": 2048, 00:21:18.252 "data_size": 63488 00:21:18.252 }, 00:21:18.252 { 00:21:18.252 "name": "BaseBdev4", 00:21:18.252 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:18.252 "is_configured": true, 00:21:18.252 "data_offset": 2048, 00:21:18.252 "data_size": 63488 00:21:18.252 } 00:21:18.252 ] 00:21:18.252 }' 00:21:18.252 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.252 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:18.834 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:18.834 13:39:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.093 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:19.093 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:19.351 [2024-07-15 13:39:58.566561] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.351 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.351 13:39:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.609 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.609 "name": "Existed_Raid", 00:21:19.609 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:19.609 "strip_size_kb": 0, 00:21:19.609 "state": "configuring", 00:21:19.609 "raid_level": "raid1", 00:21:19.609 "superblock": true, 00:21:19.609 "num_base_bdevs": 4, 00:21:19.609 "num_base_bdevs_discovered": 2, 00:21:19.609 "num_base_bdevs_operational": 4, 00:21:19.609 "base_bdevs_list": [ 00:21:19.609 { 00:21:19.609 "name": "BaseBdev1", 00:21:19.609 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:19.609 "is_configured": true, 00:21:19.609 "data_offset": 2048, 00:21:19.609 "data_size": 63488 00:21:19.609 }, 00:21:19.609 { 00:21:19.609 "name": null, 00:21:19.609 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:19.609 "is_configured": false, 00:21:19.609 "data_offset": 2048, 00:21:19.609 "data_size": 63488 00:21:19.609 }, 00:21:19.609 { 00:21:19.609 "name": null, 00:21:19.609 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:19.609 "is_configured": false, 00:21:19.609 "data_offset": 2048, 00:21:19.609 "data_size": 63488 00:21:19.609 }, 00:21:19.609 { 00:21:19.609 "name": "BaseBdev4", 00:21:19.609 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:19.609 "is_configured": true, 00:21:19.609 "data_offset": 2048, 00:21:19.609 "data_size": 63488 00:21:19.609 } 00:21:19.609 ] 00:21:19.609 }' 00:21:19.609 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.609 13:39:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:20.175 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.175 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:20.434 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:20.434 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:20.693 [2024-07-15 13:39:59.874042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.693 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.951 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.951 "name": "Existed_Raid", 00:21:20.951 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:20.951 "strip_size_kb": 0, 00:21:20.951 "state": "configuring", 00:21:20.951 "raid_level": "raid1", 00:21:20.951 "superblock": true, 00:21:20.951 "num_base_bdevs": 4, 00:21:20.951 "num_base_bdevs_discovered": 3, 00:21:20.951 "num_base_bdevs_operational": 4, 00:21:20.951 "base_bdevs_list": [ 00:21:20.951 { 00:21:20.951 "name": "BaseBdev1", 00:21:20.951 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:20.951 "is_configured": true, 00:21:20.951 "data_offset": 2048, 00:21:20.951 "data_size": 63488 00:21:20.951 }, 00:21:20.951 { 00:21:20.951 "name": null, 00:21:20.951 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:20.951 "is_configured": false, 00:21:20.951 "data_offset": 2048, 00:21:20.951 "data_size": 63488 00:21:20.951 }, 00:21:20.951 { 00:21:20.951 "name": "BaseBdev3", 00:21:20.951 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:20.951 "is_configured": true, 00:21:20.951 "data_offset": 2048, 00:21:20.951 "data_size": 63488 00:21:20.951 }, 00:21:20.951 { 00:21:20.951 "name": "BaseBdev4", 00:21:20.951 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:20.951 "is_configured": true, 00:21:20.951 "data_offset": 2048, 00:21:20.951 "data_size": 63488 00:21:20.951 } 00:21:20.951 ] 00:21:20.951 }' 00:21:20.951 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.951 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:21.517 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:21.517 13:40:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.776 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:21.776 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:21.776 [2024-07-15 13:40:01.169496] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.776 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.035 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:21:22.035 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:22.035 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.035 "name": "Existed_Raid", 00:21:22.035 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:22.035 "strip_size_kb": 0, 00:21:22.035 "state": "configuring", 00:21:22.035 "raid_level": "raid1", 00:21:22.035 "superblock": true, 00:21:22.035 "num_base_bdevs": 4, 00:21:22.035 "num_base_bdevs_discovered": 2, 00:21:22.035 "num_base_bdevs_operational": 4, 00:21:22.035 "base_bdevs_list": [ 00:21:22.035 { 00:21:22.035 "name": null, 00:21:22.035 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:22.035 "is_configured": false, 00:21:22.035 "data_offset": 2048, 00:21:22.035 "data_size": 63488 00:21:22.035 }, 00:21:22.035 { 00:21:22.035 "name": null, 00:21:22.035 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:22.035 "is_configured": false, 00:21:22.035 "data_offset": 2048, 00:21:22.035 "data_size": 63488 00:21:22.035 }, 00:21:22.035 { 00:21:22.035 "name": "BaseBdev3", 00:21:22.035 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:22.035 "is_configured": true, 00:21:22.035 "data_offset": 2048, 00:21:22.035 "data_size": 63488 00:21:22.035 }, 00:21:22.035 { 00:21:22.035 "name": "BaseBdev4", 00:21:22.035 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:22.035 "is_configured": true, 00:21:22.035 "data_offset": 2048, 00:21:22.035 "data_size": 63488 00:21:22.035 } 00:21:22.035 ] 00:21:22.035 }' 00:21:22.035 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.035 13:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:22.602 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.602 
13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:22.860 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:22.860 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:23.119 [2024-07-15 13:40:02.411200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.119 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.119 13:40:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:23.377 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.377 "name": "Existed_Raid", 00:21:23.377 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:23.377 "strip_size_kb": 0, 00:21:23.377 "state": "configuring", 00:21:23.377 "raid_level": "raid1", 00:21:23.377 "superblock": true, 00:21:23.377 "num_base_bdevs": 4, 00:21:23.377 "num_base_bdevs_discovered": 3, 00:21:23.377 "num_base_bdevs_operational": 4, 00:21:23.377 "base_bdevs_list": [ 00:21:23.377 { 00:21:23.377 "name": null, 00:21:23.377 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:23.377 "is_configured": false, 00:21:23.377 "data_offset": 2048, 00:21:23.377 "data_size": 63488 00:21:23.377 }, 00:21:23.377 { 00:21:23.377 "name": "BaseBdev2", 00:21:23.377 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:23.377 "is_configured": true, 00:21:23.377 "data_offset": 2048, 00:21:23.377 "data_size": 63488 00:21:23.377 }, 00:21:23.377 { 00:21:23.377 "name": "BaseBdev3", 00:21:23.377 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:23.377 "is_configured": true, 00:21:23.377 "data_offset": 2048, 00:21:23.377 "data_size": 63488 00:21:23.377 }, 00:21:23.377 { 00:21:23.377 "name": "BaseBdev4", 00:21:23.377 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:23.377 "is_configured": true, 00:21:23.377 "data_offset": 2048, 00:21:23.377 "data_size": 63488 00:21:23.377 } 00:21:23.377 ] 00:21:23.377 }' 00:21:23.377 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.377 13:40:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:23.974 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.974 13:40:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:24.231 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:24.231 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.231 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:24.487 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 68cf3c0e-8746-4907-9459-e7b023649b5b 00:21:24.743 [2024-07-15 13:40:03.975879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:24.743 [2024-07-15 13:40:03.976056] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbb5180 00:21:24.743 [2024-07-15 13:40:03.976070] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:24.743 [2024-07-15 13:40:03.976240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb5c20 00:21:24.743 [2024-07-15 13:40:03.976370] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbb5180 00:21:24.743 [2024-07-15 13:40:03.976380] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbb5180 00:21:24.743 [2024-07-15 13:40:03.976479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:24.743 NewBaseBdev 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:24.743 13:40:03 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:24.743 13:40:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:24.998 13:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:25.255 [ 00:21:25.255 { 00:21:25.255 "name": "NewBaseBdev", 00:21:25.255 "aliases": [ 00:21:25.255 "68cf3c0e-8746-4907-9459-e7b023649b5b" 00:21:25.255 ], 00:21:25.255 "product_name": "Malloc disk", 00:21:25.255 "block_size": 512, 00:21:25.255 "num_blocks": 65536, 00:21:25.255 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:25.255 "assigned_rate_limits": { 00:21:25.255 "rw_ios_per_sec": 0, 00:21:25.255 "rw_mbytes_per_sec": 0, 00:21:25.255 "r_mbytes_per_sec": 0, 00:21:25.255 "w_mbytes_per_sec": 0 00:21:25.255 }, 00:21:25.255 "claimed": true, 00:21:25.255 "claim_type": "exclusive_write", 00:21:25.255 "zoned": false, 00:21:25.255 "supported_io_types": { 00:21:25.255 "read": true, 00:21:25.255 "write": true, 00:21:25.255 "unmap": true, 00:21:25.255 "flush": true, 00:21:25.255 "reset": true, 00:21:25.255 "nvme_admin": false, 00:21:25.255 "nvme_io": false, 00:21:25.255 "nvme_io_md": false, 00:21:25.255 "write_zeroes": true, 00:21:25.255 "zcopy": true, 00:21:25.255 "get_zone_info": false, 00:21:25.255 "zone_management": false, 00:21:25.255 "zone_append": false, 00:21:25.255 "compare": false, 00:21:25.255 
"compare_and_write": false, 00:21:25.255 "abort": true, 00:21:25.255 "seek_hole": false, 00:21:25.255 "seek_data": false, 00:21:25.255 "copy": true, 00:21:25.255 "nvme_iov_md": false 00:21:25.255 }, 00:21:25.255 "memory_domains": [ 00:21:25.255 { 00:21:25.255 "dma_device_id": "system", 00:21:25.255 "dma_device_type": 1 00:21:25.255 }, 00:21:25.255 { 00:21:25.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.255 "dma_device_type": 2 00:21:25.255 } 00:21:25.255 ], 00:21:25.255 "driver_specific": {} 00:21:25.255 } 00:21:25.255 ] 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:25.255 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:25.512 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.512 "name": "Existed_Raid", 00:21:25.512 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:25.512 "strip_size_kb": 0, 00:21:25.512 "state": "online", 00:21:25.512 "raid_level": "raid1", 00:21:25.512 "superblock": true, 00:21:25.512 "num_base_bdevs": 4, 00:21:25.512 "num_base_bdevs_discovered": 4, 00:21:25.512 "num_base_bdevs_operational": 4, 00:21:25.512 "base_bdevs_list": [ 00:21:25.512 { 00:21:25.512 "name": "NewBaseBdev", 00:21:25.512 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:25.512 "is_configured": true, 00:21:25.512 "data_offset": 2048, 00:21:25.512 "data_size": 63488 00:21:25.512 }, 00:21:25.512 { 00:21:25.512 "name": "BaseBdev2", 00:21:25.512 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:25.512 "is_configured": true, 00:21:25.512 "data_offset": 2048, 00:21:25.512 "data_size": 63488 00:21:25.512 }, 00:21:25.512 { 00:21:25.512 "name": "BaseBdev3", 00:21:25.512 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:25.512 "is_configured": true, 00:21:25.512 "data_offset": 2048, 00:21:25.512 "data_size": 63488 00:21:25.512 }, 00:21:25.512 { 00:21:25.512 "name": "BaseBdev4", 00:21:25.512 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:25.512 "is_configured": true, 00:21:25.512 "data_offset": 2048, 00:21:25.512 "data_size": 63488 00:21:25.512 } 00:21:25.512 ] 00:21:25.512 }' 00:21:25.512 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.512 13:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:26.076 13:40:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:26.076 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:26.334 [2024-07-15 13:40:05.540376] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:26.334 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:26.334 "name": "Existed_Raid", 00:21:26.334 "aliases": [ 00:21:26.334 "66a30d45-7093-49dc-b9ca-b169b70a595e" 00:21:26.334 ], 00:21:26.334 "product_name": "Raid Volume", 00:21:26.334 "block_size": 512, 00:21:26.334 "num_blocks": 63488, 00:21:26.334 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:26.334 "assigned_rate_limits": { 00:21:26.334 "rw_ios_per_sec": 0, 00:21:26.334 "rw_mbytes_per_sec": 0, 00:21:26.334 "r_mbytes_per_sec": 0, 00:21:26.334 "w_mbytes_per_sec": 0 00:21:26.334 }, 00:21:26.334 "claimed": false, 00:21:26.334 "zoned": false, 00:21:26.334 "supported_io_types": { 00:21:26.334 "read": true, 00:21:26.334 "write": true, 00:21:26.334 "unmap": false, 00:21:26.334 "flush": false, 00:21:26.334 "reset": true, 00:21:26.334 "nvme_admin": false, 00:21:26.334 "nvme_io": false, 00:21:26.334 "nvme_io_md": false, 00:21:26.334 "write_zeroes": true, 00:21:26.334 "zcopy": false, 00:21:26.334 
"get_zone_info": false, 00:21:26.334 "zone_management": false, 00:21:26.334 "zone_append": false, 00:21:26.334 "compare": false, 00:21:26.334 "compare_and_write": false, 00:21:26.334 "abort": false, 00:21:26.334 "seek_hole": false, 00:21:26.334 "seek_data": false, 00:21:26.334 "copy": false, 00:21:26.334 "nvme_iov_md": false 00:21:26.334 }, 00:21:26.334 "memory_domains": [ 00:21:26.334 { 00:21:26.334 "dma_device_id": "system", 00:21:26.334 "dma_device_type": 1 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.334 "dma_device_type": 2 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "system", 00:21:26.334 "dma_device_type": 1 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.334 "dma_device_type": 2 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "system", 00:21:26.334 "dma_device_type": 1 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.334 "dma_device_type": 2 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "system", 00:21:26.334 "dma_device_type": 1 00:21:26.334 }, 00:21:26.334 { 00:21:26.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.334 "dma_device_type": 2 00:21:26.334 } 00:21:26.334 ], 00:21:26.334 "driver_specific": { 00:21:26.334 "raid": { 00:21:26.334 "uuid": "66a30d45-7093-49dc-b9ca-b169b70a595e", 00:21:26.334 "strip_size_kb": 0, 00:21:26.334 "state": "online", 00:21:26.334 "raid_level": "raid1", 00:21:26.334 "superblock": true, 00:21:26.334 "num_base_bdevs": 4, 00:21:26.334 "num_base_bdevs_discovered": 4, 00:21:26.334 "num_base_bdevs_operational": 4, 00:21:26.334 "base_bdevs_list": [ 00:21:26.334 { 00:21:26.334 "name": "NewBaseBdev", 00:21:26.335 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:26.335 "is_configured": true, 00:21:26.335 "data_offset": 2048, 00:21:26.335 "data_size": 63488 00:21:26.335 }, 00:21:26.335 { 00:21:26.335 "name": "BaseBdev2", 00:21:26.335 
"uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:26.335 "is_configured": true, 00:21:26.335 "data_offset": 2048, 00:21:26.335 "data_size": 63488 00:21:26.335 }, 00:21:26.335 { 00:21:26.335 "name": "BaseBdev3", 00:21:26.335 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:26.335 "is_configured": true, 00:21:26.335 "data_offset": 2048, 00:21:26.335 "data_size": 63488 00:21:26.335 }, 00:21:26.335 { 00:21:26.335 "name": "BaseBdev4", 00:21:26.335 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:26.335 "is_configured": true, 00:21:26.335 "data_offset": 2048, 00:21:26.335 "data_size": 63488 00:21:26.335 } 00:21:26.335 ] 00:21:26.335 } 00:21:26.335 } 00:21:26.335 }' 00:21:26.335 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:26.335 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:26.335 BaseBdev2 00:21:26.335 BaseBdev3 00:21:26.335 BaseBdev4' 00:21:26.335 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.335 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:26.335 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.899 "name": "NewBaseBdev", 00:21:26.899 "aliases": [ 00:21:26.899 "68cf3c0e-8746-4907-9459-e7b023649b5b" 00:21:26.899 ], 00:21:26.899 "product_name": "Malloc disk", 00:21:26.899 "block_size": 512, 00:21:26.899 "num_blocks": 65536, 00:21:26.899 "uuid": "68cf3c0e-8746-4907-9459-e7b023649b5b", 00:21:26.899 "assigned_rate_limits": { 00:21:26.899 "rw_ios_per_sec": 0, 00:21:26.899 "rw_mbytes_per_sec": 0, 
00:21:26.899 "r_mbytes_per_sec": 0, 00:21:26.899 "w_mbytes_per_sec": 0 00:21:26.899 }, 00:21:26.899 "claimed": true, 00:21:26.899 "claim_type": "exclusive_write", 00:21:26.899 "zoned": false, 00:21:26.899 "supported_io_types": { 00:21:26.899 "read": true, 00:21:26.899 "write": true, 00:21:26.899 "unmap": true, 00:21:26.899 "flush": true, 00:21:26.899 "reset": true, 00:21:26.899 "nvme_admin": false, 00:21:26.899 "nvme_io": false, 00:21:26.899 "nvme_io_md": false, 00:21:26.899 "write_zeroes": true, 00:21:26.899 "zcopy": true, 00:21:26.899 "get_zone_info": false, 00:21:26.899 "zone_management": false, 00:21:26.899 "zone_append": false, 00:21:26.899 "compare": false, 00:21:26.899 "compare_and_write": false, 00:21:26.899 "abort": true, 00:21:26.899 "seek_hole": false, 00:21:26.899 "seek_data": false, 00:21:26.899 "copy": true, 00:21:26.899 "nvme_iov_md": false 00:21:26.899 }, 00:21:26.899 "memory_domains": [ 00:21:26.899 { 00:21:26.899 "dma_device_id": "system", 00:21:26.899 "dma_device_type": 1 00:21:26.899 }, 00:21:26.899 { 00:21:26.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.899 "dma_device_type": 2 00:21:26.899 } 00:21:26.899 ], 00:21:26.899 "driver_specific": {} 00:21:26.899 }' 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.899 13:40:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.899 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.157 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.157 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.157 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.157 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:27.157 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.414 "name": "BaseBdev2", 00:21:27.414 "aliases": [ 00:21:27.414 "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62" 00:21:27.414 ], 00:21:27.414 "product_name": "Malloc disk", 00:21:27.414 "block_size": 512, 00:21:27.414 "num_blocks": 65536, 00:21:27.414 "uuid": "219765a3-0cf1-4ac1-98a3-6b1bba9d4a62", 00:21:27.414 "assigned_rate_limits": { 00:21:27.414 "rw_ios_per_sec": 0, 00:21:27.414 "rw_mbytes_per_sec": 0, 00:21:27.414 "r_mbytes_per_sec": 0, 00:21:27.414 "w_mbytes_per_sec": 0 00:21:27.414 }, 00:21:27.414 "claimed": true, 00:21:27.414 "claim_type": "exclusive_write", 00:21:27.414 "zoned": false, 00:21:27.414 "supported_io_types": { 00:21:27.414 "read": true, 00:21:27.414 "write": true, 00:21:27.414 "unmap": true, 00:21:27.414 "flush": true, 00:21:27.414 "reset": true, 00:21:27.414 "nvme_admin": false, 00:21:27.414 "nvme_io": false, 00:21:27.414 "nvme_io_md": false, 00:21:27.414 "write_zeroes": true, 00:21:27.414 "zcopy": true, 00:21:27.414 
"get_zone_info": false, 00:21:27.414 "zone_management": false, 00:21:27.414 "zone_append": false, 00:21:27.414 "compare": false, 00:21:27.414 "compare_and_write": false, 00:21:27.414 "abort": true, 00:21:27.414 "seek_hole": false, 00:21:27.414 "seek_data": false, 00:21:27.414 "copy": true, 00:21:27.414 "nvme_iov_md": false 00:21:27.414 }, 00:21:27.414 "memory_domains": [ 00:21:27.414 { 00:21:27.414 "dma_device_id": "system", 00:21:27.414 "dma_device_type": 1 00:21:27.414 }, 00:21:27.414 { 00:21:27.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.414 "dma_device_type": 2 00:21:27.414 } 00:21:27.414 ], 00:21:27.414 "driver_specific": {} 00:21:27.414 }' 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.414 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:27.672 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:27.931 "name": "BaseBdev3", 00:21:27.931 "aliases": [ 00:21:27.931 "7dad5a71-c7ea-4593-81b4-63f1b70b2403" 00:21:27.931 ], 00:21:27.931 "product_name": "Malloc disk", 00:21:27.931 "block_size": 512, 00:21:27.931 "num_blocks": 65536, 00:21:27.931 "uuid": "7dad5a71-c7ea-4593-81b4-63f1b70b2403", 00:21:27.931 "assigned_rate_limits": { 00:21:27.931 "rw_ios_per_sec": 0, 00:21:27.931 "rw_mbytes_per_sec": 0, 00:21:27.931 "r_mbytes_per_sec": 0, 00:21:27.931 "w_mbytes_per_sec": 0 00:21:27.931 }, 00:21:27.931 "claimed": true, 00:21:27.931 "claim_type": "exclusive_write", 00:21:27.931 "zoned": false, 00:21:27.931 "supported_io_types": { 00:21:27.931 "read": true, 00:21:27.931 "write": true, 00:21:27.931 "unmap": true, 00:21:27.931 "flush": true, 00:21:27.931 "reset": true, 00:21:27.931 "nvme_admin": false, 00:21:27.931 "nvme_io": false, 00:21:27.931 "nvme_io_md": false, 00:21:27.931 "write_zeroes": true, 00:21:27.931 "zcopy": true, 00:21:27.931 "get_zone_info": false, 00:21:27.931 "zone_management": false, 00:21:27.931 "zone_append": false, 00:21:27.931 "compare": false, 00:21:27.931 "compare_and_write": false, 00:21:27.931 "abort": true, 00:21:27.931 "seek_hole": false, 00:21:27.931 "seek_data": false, 00:21:27.931 "copy": true, 00:21:27.931 "nvme_iov_md": false 00:21:27.931 }, 00:21:27.931 "memory_domains": [ 00:21:27.931 { 00:21:27.931 "dma_device_id": "system", 00:21:27.931 "dma_device_type": 1 00:21:27.931 }, 00:21:27.931 { 00:21:27.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.931 
"dma_device_type": 2 00:21:27.931 } 00:21:27.931 ], 00:21:27.931 "driver_specific": {} 00:21:27.931 }' 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:27.931 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:28.189 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:28.448 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:28.448 "name": "BaseBdev4", 00:21:28.448 "aliases": [ 00:21:28.448 
"d03daf1e-6102-4a34-a0f7-31c113db2aa2" 00:21:28.448 ], 00:21:28.448 "product_name": "Malloc disk", 00:21:28.448 "block_size": 512, 00:21:28.448 "num_blocks": 65536, 00:21:28.448 "uuid": "d03daf1e-6102-4a34-a0f7-31c113db2aa2", 00:21:28.448 "assigned_rate_limits": { 00:21:28.448 "rw_ios_per_sec": 0, 00:21:28.448 "rw_mbytes_per_sec": 0, 00:21:28.448 "r_mbytes_per_sec": 0, 00:21:28.448 "w_mbytes_per_sec": 0 00:21:28.448 }, 00:21:28.448 "claimed": true, 00:21:28.448 "claim_type": "exclusive_write", 00:21:28.448 "zoned": false, 00:21:28.448 "supported_io_types": { 00:21:28.448 "read": true, 00:21:28.448 "write": true, 00:21:28.448 "unmap": true, 00:21:28.448 "flush": true, 00:21:28.448 "reset": true, 00:21:28.448 "nvme_admin": false, 00:21:28.448 "nvme_io": false, 00:21:28.448 "nvme_io_md": false, 00:21:28.448 "write_zeroes": true, 00:21:28.448 "zcopy": true, 00:21:28.448 "get_zone_info": false, 00:21:28.448 "zone_management": false, 00:21:28.448 "zone_append": false, 00:21:28.448 "compare": false, 00:21:28.448 "compare_and_write": false, 00:21:28.448 "abort": true, 00:21:28.448 "seek_hole": false, 00:21:28.448 "seek_data": false, 00:21:28.448 "copy": true, 00:21:28.448 "nvme_iov_md": false 00:21:28.448 }, 00:21:28.448 "memory_domains": [ 00:21:28.448 { 00:21:28.448 "dma_device_id": "system", 00:21:28.448 "dma_device_type": 1 00:21:28.448 }, 00:21:28.448 { 00:21:28.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.448 "dma_device_type": 2 00:21:28.448 } 00:21:28.448 ], 00:21:28.448 "driver_specific": {} 00:21:28.448 }' 00:21:28.448 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.448 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:28.448 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:28.448 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.706 13:40:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:28.706 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:28.706 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.706 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:28.706 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:28.706 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.706 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:28.706 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:28.706 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:28.965 [2024-07-15 13:40:08.235218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:28.965 [2024-07-15 13:40:08.235242] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:28.965 [2024-07-15 13:40:08.235292] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:28.965 [2024-07-15 13:40:08.235575] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:28.965 [2024-07-15 13:40:08.235588] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb5180 name Existed_Raid, state offline 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2163390 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2163390 ']' 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 2163390 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2163390 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2163390' 00:21:28.965 killing process with pid 2163390 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2163390 00:21:28.965 [2024-07-15 13:40:08.303359] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:28.965 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2163390 00:21:28.965 [2024-07-15 13:40:08.340121] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:29.224 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:29.224 00:21:29.224 real 0m32.785s 00:21:29.224 user 1m0.244s 00:21:29.224 sys 0m5.770s 00:21:29.224 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:29.224 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:29.224 ************************************ 00:21:29.224 END TEST raid_state_function_test_sb 00:21:29.224 ************************************ 00:21:29.224 13:40:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:29.224 13:40:08 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:21:29.224 13:40:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:29.224 13:40:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:29.224 13:40:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:29.224 ************************************ 00:21:29.224 START TEST raid_superblock_test 00:21:29.224 ************************************ 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:29.224 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2168305 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2168305 /var/tmp/spdk-raid.sock 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2168305 ']' 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:29.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:29.482 13:40:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.482 [2024-07-15 13:40:08.702074] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:21:29.482 [2024-07-15 13:40:08.702129] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2168305 ] 00:21:29.482 [2024-07-15 13:40:08.813070] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:29.741 [2024-07-15 13:40:08.918555] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:29.741 [2024-07-15 13:40:08.973278] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:29.741 [2024-07-15 13:40:08.973304] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:30.306 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:30.562 malloc1 00:21:30.562 13:40:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:30.820 [2024-07-15 13:40:10.064890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:30.820 [2024-07-15 13:40:10.064958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.820 [2024-07-15 13:40:10.064978] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcf570 00:21:30.820 [2024-07-15 13:40:10.064991] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.820 [2024-07-15 13:40:10.066734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.820 [2024-07-15 13:40:10.066771] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:30.820 pt1 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:30.820 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:30.820 13:40:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:31.077 malloc2 00:21:31.077 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:31.334 [2024-07-15 13:40:10.563020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:31.334 [2024-07-15 13:40:10.563067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.334 [2024-07-15 13:40:10.563085] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd0970 00:21:31.334 [2024-07-15 13:40:10.563098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.334 [2024-07-15 13:40:10.564609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.334 [2024-07-15 13:40:10.564636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:31.334 pt2 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:31.334 13:40:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:31.334 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:31.591 malloc3 00:21:31.591 13:40:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:31.848 [2024-07-15 13:40:11.048846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:31.848 [2024-07-15 13:40:11.048893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.848 [2024-07-15 13:40:11.048909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf67340 00:21:31.848 [2024-07-15 13:40:11.048922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.848 [2024-07-15 13:40:11.050306] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.848 [2024-07-15 13:40:11.050332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:31.848 pt3 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:31.848 
13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:31.848 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:32.106 malloc4 00:21:32.106 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:32.364 [2024-07-15 13:40:11.554775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:32.364 [2024-07-15 13:40:11.554821] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:32.364 [2024-07-15 13:40:11.554842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf69c60 00:21:32.364 [2024-07-15 13:40:11.554855] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:32.364 [2024-07-15 13:40:11.556290] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:32.364 [2024-07-15 13:40:11.556318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:32.364 pt4 00:21:32.364 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:32.364 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:32.364 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:32.621 [2024-07-15 13:40:11.795425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:32.621 [2024-07-15 13:40:11.796592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:32.621 [2024-07-15 13:40:11.796647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:32.621 [2024-07-15 13:40:11.796690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:32.621 [2024-07-15 13:40:11.796859] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc7530 00:21:32.621 [2024-07-15 13:40:11.796870] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:32.621 [2024-07-15 13:40:11.797061] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdc5770 00:21:32.621 [2024-07-15 13:40:11.797209] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc7530 00:21:32.621 [2024-07-15 13:40:11.797219] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc7530 00:21:32.621 [2024-07-15 13:40:11.797308] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.621 13:40:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.879 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.879 "name": "raid_bdev1", 00:21:32.879 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:32.879 "strip_size_kb": 0, 00:21:32.879 "state": "online", 00:21:32.879 "raid_level": "raid1", 00:21:32.879 "superblock": true, 00:21:32.879 "num_base_bdevs": 4, 00:21:32.879 "num_base_bdevs_discovered": 4, 00:21:32.879 "num_base_bdevs_operational": 4, 00:21:32.879 "base_bdevs_list": [ 00:21:32.879 { 00:21:32.879 "name": "pt1", 00:21:32.879 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:32.879 "is_configured": true, 00:21:32.879 "data_offset": 2048, 00:21:32.879 "data_size": 63488 00:21:32.879 }, 00:21:32.879 { 00:21:32.879 "name": "pt2", 00:21:32.879 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:32.879 "is_configured": true, 00:21:32.879 "data_offset": 2048, 00:21:32.879 "data_size": 63488 00:21:32.879 }, 00:21:32.879 { 00:21:32.879 "name": "pt3", 00:21:32.879 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:32.879 "is_configured": true, 00:21:32.879 "data_offset": 2048, 00:21:32.879 "data_size": 63488 00:21:32.879 }, 00:21:32.879 { 00:21:32.879 "name": "pt4", 00:21:32.879 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:32.879 "is_configured": true, 00:21:32.879 "data_offset": 2048, 00:21:32.879 "data_size": 63488 00:21:32.879 } 00:21:32.879 ] 00:21:32.879 }' 00:21:32.879 13:40:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.879 13:40:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:33.445 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:33.446 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:33.446 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:33.744 [2024-07-15 13:40:12.894622] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:33.744 "name": "raid_bdev1", 00:21:33.744 "aliases": [ 00:21:33.744 "d64b956b-2db4-44fe-aaec-ed9773519438" 00:21:33.744 ], 00:21:33.744 "product_name": "Raid Volume", 00:21:33.744 "block_size": 512, 00:21:33.744 "num_blocks": 63488, 00:21:33.744 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:33.744 "assigned_rate_limits": { 00:21:33.744 "rw_ios_per_sec": 0, 00:21:33.744 "rw_mbytes_per_sec": 0, 00:21:33.744 "r_mbytes_per_sec": 0, 00:21:33.744 "w_mbytes_per_sec": 0 00:21:33.744 }, 00:21:33.744 "claimed": false, 00:21:33.744 "zoned": false, 00:21:33.744 "supported_io_types": { 00:21:33.744 "read": true, 00:21:33.744 "write": true, 00:21:33.744 
"unmap": false, 00:21:33.744 "flush": false, 00:21:33.744 "reset": true, 00:21:33.744 "nvme_admin": false, 00:21:33.744 "nvme_io": false, 00:21:33.744 "nvme_io_md": false, 00:21:33.744 "write_zeroes": true, 00:21:33.744 "zcopy": false, 00:21:33.744 "get_zone_info": false, 00:21:33.744 "zone_management": false, 00:21:33.744 "zone_append": false, 00:21:33.744 "compare": false, 00:21:33.744 "compare_and_write": false, 00:21:33.744 "abort": false, 00:21:33.744 "seek_hole": false, 00:21:33.744 "seek_data": false, 00:21:33.744 "copy": false, 00:21:33.744 "nvme_iov_md": false 00:21:33.744 }, 00:21:33.744 "memory_domains": [ 00:21:33.744 { 00:21:33.744 "dma_device_id": "system", 00:21:33.744 "dma_device_type": 1 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.744 "dma_device_type": 2 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "system", 00:21:33.744 "dma_device_type": 1 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.744 "dma_device_type": 2 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "system", 00:21:33.744 "dma_device_type": 1 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.744 "dma_device_type": 2 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "system", 00:21:33.744 "dma_device_type": 1 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.744 "dma_device_type": 2 00:21:33.744 } 00:21:33.744 ], 00:21:33.744 "driver_specific": { 00:21:33.744 "raid": { 00:21:33.744 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:33.744 "strip_size_kb": 0, 00:21:33.744 "state": "online", 00:21:33.744 "raid_level": "raid1", 00:21:33.744 "superblock": true, 00:21:33.744 "num_base_bdevs": 4, 00:21:33.744 "num_base_bdevs_discovered": 4, 00:21:33.744 "num_base_bdevs_operational": 4, 00:21:33.744 "base_bdevs_list": [ 00:21:33.744 { 00:21:33.744 "name": "pt1", 
00:21:33.744 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:33.744 "is_configured": true, 00:21:33.744 "data_offset": 2048, 00:21:33.744 "data_size": 63488 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "name": "pt2", 00:21:33.744 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:33.744 "is_configured": true, 00:21:33.744 "data_offset": 2048, 00:21:33.744 "data_size": 63488 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "name": "pt3", 00:21:33.744 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:33.744 "is_configured": true, 00:21:33.744 "data_offset": 2048, 00:21:33.744 "data_size": 63488 00:21:33.744 }, 00:21:33.744 { 00:21:33.744 "name": "pt4", 00:21:33.744 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:33.744 "is_configured": true, 00:21:33.744 "data_offset": 2048, 00:21:33.744 "data_size": 63488 00:21:33.744 } 00:21:33.744 ] 00:21:33.744 } 00:21:33.744 } 00:21:33.744 }' 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:33.744 pt2 00:21:33.744 pt3 00:21:33.744 pt4' 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:33.744 13:40:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:34.001 "name": "pt1", 00:21:34.001 "aliases": [ 00:21:34.001 "00000000-0000-0000-0000-000000000001" 00:21:34.001 ], 00:21:34.001 "product_name": "passthru", 00:21:34.001 "block_size": 512, 00:21:34.001 "num_blocks": 65536, 00:21:34.001 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:34.001 "assigned_rate_limits": { 00:21:34.001 "rw_ios_per_sec": 0, 00:21:34.001 "rw_mbytes_per_sec": 0, 00:21:34.001 "r_mbytes_per_sec": 0, 00:21:34.001 "w_mbytes_per_sec": 0 00:21:34.001 }, 00:21:34.001 "claimed": true, 00:21:34.001 "claim_type": "exclusive_write", 00:21:34.001 "zoned": false, 00:21:34.001 "supported_io_types": { 00:21:34.001 "read": true, 00:21:34.001 "write": true, 00:21:34.001 "unmap": true, 00:21:34.001 "flush": true, 00:21:34.001 "reset": true, 00:21:34.001 "nvme_admin": false, 00:21:34.001 "nvme_io": false, 00:21:34.001 "nvme_io_md": false, 00:21:34.001 "write_zeroes": true, 00:21:34.001 "zcopy": true, 00:21:34.001 "get_zone_info": false, 00:21:34.001 "zone_management": false, 00:21:34.001 "zone_append": false, 00:21:34.001 "compare": false, 00:21:34.001 "compare_and_write": false, 00:21:34.001 "abort": true, 00:21:34.001 "seek_hole": false, 00:21:34.001 "seek_data": false, 00:21:34.001 "copy": true, 00:21:34.001 "nvme_iov_md": false 00:21:34.001 }, 00:21:34.001 "memory_domains": [ 00:21:34.001 { 00:21:34.001 "dma_device_id": "system", 00:21:34.001 "dma_device_type": 1 00:21:34.001 }, 00:21:34.001 { 00:21:34.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.001 "dma_device_type": 2 00:21:34.001 } 00:21:34.001 ], 00:21:34.001 "driver_specific": { 00:21:34.001 "passthru": { 00:21:34.001 "name": "pt1", 00:21:34.001 "base_bdev_name": "malloc1" 00:21:34.001 } 00:21:34.001 } 00:21:34.001 }' 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.001 13:40:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.001 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:34.259 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:34.517 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:34.517 "name": "pt2", 00:21:34.517 "aliases": [ 00:21:34.517 "00000000-0000-0000-0000-000000000002" 00:21:34.517 ], 00:21:34.517 "product_name": "passthru", 00:21:34.517 "block_size": 512, 00:21:34.517 "num_blocks": 65536, 00:21:34.517 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:34.517 "assigned_rate_limits": { 00:21:34.517 "rw_ios_per_sec": 0, 00:21:34.517 "rw_mbytes_per_sec": 0, 00:21:34.517 "r_mbytes_per_sec": 0, 00:21:34.517 "w_mbytes_per_sec": 0 00:21:34.517 }, 00:21:34.517 "claimed": true, 00:21:34.517 "claim_type": "exclusive_write", 00:21:34.517 "zoned": false, 00:21:34.517 "supported_io_types": { 00:21:34.517 "read": true, 00:21:34.517 "write": true, 00:21:34.517 "unmap": true, 00:21:34.517 "flush": true, 00:21:34.517 "reset": true, 00:21:34.517 "nvme_admin": false, 00:21:34.517 
"nvme_io": false, 00:21:34.517 "nvme_io_md": false, 00:21:34.517 "write_zeroes": true, 00:21:34.517 "zcopy": true, 00:21:34.517 "get_zone_info": false, 00:21:34.517 "zone_management": false, 00:21:34.517 "zone_append": false, 00:21:34.517 "compare": false, 00:21:34.517 "compare_and_write": false, 00:21:34.517 "abort": true, 00:21:34.517 "seek_hole": false, 00:21:34.517 "seek_data": false, 00:21:34.517 "copy": true, 00:21:34.517 "nvme_iov_md": false 00:21:34.517 }, 00:21:34.517 "memory_domains": [ 00:21:34.517 { 00:21:34.517 "dma_device_id": "system", 00:21:34.517 "dma_device_type": 1 00:21:34.517 }, 00:21:34.517 { 00:21:34.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.517 "dma_device_type": 2 00:21:34.517 } 00:21:34.517 ], 00:21:34.517 "driver_specific": { 00:21:34.517 "passthru": { 00:21:34.517 "name": "pt2", 00:21:34.517 "base_bdev_name": "malloc2" 00:21:34.517 } 00:21:34.517 } 00:21:34.517 }' 00:21:34.517 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.517 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.517 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.517 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.775 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.775 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.775 13:40:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:34.775 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.033 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.033 "name": "pt3", 00:21:35.033 "aliases": [ 00:21:35.033 "00000000-0000-0000-0000-000000000003" 00:21:35.033 ], 00:21:35.033 "product_name": "passthru", 00:21:35.033 "block_size": 512, 00:21:35.033 "num_blocks": 65536, 00:21:35.033 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:35.033 "assigned_rate_limits": { 00:21:35.033 "rw_ios_per_sec": 0, 00:21:35.033 "rw_mbytes_per_sec": 0, 00:21:35.033 "r_mbytes_per_sec": 0, 00:21:35.033 "w_mbytes_per_sec": 0 00:21:35.033 }, 00:21:35.033 "claimed": true, 00:21:35.033 "claim_type": "exclusive_write", 00:21:35.033 "zoned": false, 00:21:35.033 "supported_io_types": { 00:21:35.033 "read": true, 00:21:35.033 "write": true, 00:21:35.033 "unmap": true, 00:21:35.033 "flush": true, 00:21:35.033 "reset": true, 00:21:35.033 "nvme_admin": false, 00:21:35.033 "nvme_io": false, 00:21:35.033 "nvme_io_md": false, 00:21:35.033 "write_zeroes": true, 00:21:35.033 "zcopy": true, 00:21:35.033 "get_zone_info": false, 00:21:35.033 "zone_management": false, 00:21:35.033 "zone_append": false, 00:21:35.033 "compare": false, 00:21:35.033 "compare_and_write": false, 00:21:35.033 "abort": true, 00:21:35.033 "seek_hole": false, 00:21:35.033 "seek_data": false, 00:21:35.033 "copy": true, 00:21:35.033 "nvme_iov_md": false 00:21:35.033 }, 00:21:35.033 "memory_domains": [ 00:21:35.033 { 00:21:35.033 "dma_device_id": "system", 00:21:35.033 
"dma_device_type": 1 00:21:35.033 }, 00:21:35.033 { 00:21:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.033 "dma_device_type": 2 00:21:35.033 } 00:21:35.033 ], 00:21:35.033 "driver_specific": { 00:21:35.033 "passthru": { 00:21:35.033 "name": "pt3", 00:21:35.033 "base_bdev_name": "malloc3" 00:21:35.033 } 00:21:35.033 } 00:21:35.033 }' 00:21:35.033 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.033 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.291 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.549 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.549 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:35.549 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.549 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:35.549 13:40:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.807 "name": "pt4", 00:21:35.807 "aliases": [ 00:21:35.807 "00000000-0000-0000-0000-000000000004" 00:21:35.807 ], 00:21:35.807 "product_name": "passthru", 00:21:35.807 "block_size": 512, 00:21:35.807 "num_blocks": 65536, 00:21:35.807 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:35.807 "assigned_rate_limits": { 00:21:35.807 "rw_ios_per_sec": 0, 00:21:35.807 "rw_mbytes_per_sec": 0, 00:21:35.807 "r_mbytes_per_sec": 0, 00:21:35.807 "w_mbytes_per_sec": 0 00:21:35.807 }, 00:21:35.807 "claimed": true, 00:21:35.807 "claim_type": "exclusive_write", 00:21:35.807 "zoned": false, 00:21:35.807 "supported_io_types": { 00:21:35.807 "read": true, 00:21:35.807 "write": true, 00:21:35.807 "unmap": true, 00:21:35.807 "flush": true, 00:21:35.807 "reset": true, 00:21:35.807 "nvme_admin": false, 00:21:35.807 "nvme_io": false, 00:21:35.807 "nvme_io_md": false, 00:21:35.807 "write_zeroes": true, 00:21:35.807 "zcopy": true, 00:21:35.807 "get_zone_info": false, 00:21:35.807 "zone_management": false, 00:21:35.807 "zone_append": false, 00:21:35.807 "compare": false, 00:21:35.807 "compare_and_write": false, 00:21:35.807 "abort": true, 00:21:35.807 "seek_hole": false, 00:21:35.807 "seek_data": false, 00:21:35.807 "copy": true, 00:21:35.807 "nvme_iov_md": false 00:21:35.807 }, 00:21:35.807 "memory_domains": [ 00:21:35.807 { 00:21:35.807 "dma_device_id": "system", 00:21:35.807 "dma_device_type": 1 00:21:35.807 }, 00:21:35.807 { 00:21:35.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.807 "dma_device_type": 2 00:21:35.807 } 00:21:35.807 ], 00:21:35.807 "driver_specific": { 00:21:35.807 "passthru": { 00:21:35.807 "name": "pt4", 00:21:35.807 "base_bdev_name": "malloc4" 00:21:35.807 } 00:21:35.807 } 00:21:35.807 }' 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.807 13:40:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.807 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:36.065 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:36.454 [2024-07-15 13:40:15.581754] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:36.454 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d64b956b-2db4-44fe-aaec-ed9773519438 00:21:36.454 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d64b956b-2db4-44fe-aaec-ed9773519438 ']' 00:21:36.454 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:36.454 [2024-07-15 13:40:15.826105] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:36.454 
[2024-07-15 13:40:15.826130] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:36.454 [2024-07-15 13:40:15.826187] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:36.454 [2024-07-15 13:40:15.826277] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:36.454 [2024-07-15 13:40:15.826289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc7530 name raid_bdev1, state offline 00:21:36.454 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.454 13:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:36.710 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:36.710 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:36.710 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:36.710 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:36.967 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:36.967 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:37.224 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:37.224 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:37.480 13:40:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:37.481 13:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:37.737 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:37.737 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:37.995 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:38.254 [2024-07-15 13:40:17.538586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:38.254 [2024-07-15 13:40:17.539935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:38.254 [2024-07-15 13:40:17.539981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:38.254 [2024-07-15 13:40:17.540014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:38.254 [2024-07-15 13:40:17.540059] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:38.254 [2024-07-15 13:40:17.540098] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:38.254 [2024-07-15 13:40:17.540121] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:38.254 [2024-07-15 13:40:17.540143] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:38.254 [2024-07-15 
13:40:17.540161] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:38.254 [2024-07-15 13:40:17.540171] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf72ff0 name raid_bdev1, state configuring 00:21:38.254 request: 00:21:38.254 { 00:21:38.254 "name": "raid_bdev1", 00:21:38.254 "raid_level": "raid1", 00:21:38.254 "base_bdevs": [ 00:21:38.254 "malloc1", 00:21:38.254 "malloc2", 00:21:38.254 "malloc3", 00:21:38.254 "malloc4" 00:21:38.254 ], 00:21:38.254 "superblock": false, 00:21:38.254 "method": "bdev_raid_create", 00:21:38.254 "req_id": 1 00:21:38.254 } 00:21:38.254 Got JSON-RPC error response 00:21:38.254 response: 00:21:38.254 { 00:21:38.254 "code": -17, 00:21:38.254 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:38.254 } 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.254 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:38.512 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:38.512 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:38.512 13:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:38.771 [2024-07-15 13:40:18.047868] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:38.771 [2024-07-15 13:40:18.047919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.771 [2024-07-15 13:40:18.047951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcf7a0 00:21:38.771 [2024-07-15 13:40:18.047965] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.771 [2024-07-15 13:40:18.049639] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.771 [2024-07-15 13:40:18.049669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:38.771 [2024-07-15 13:40:18.049740] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:38.771 [2024-07-15 13:40:18.049769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:38.771 pt1 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.771 13:40:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.771 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.030 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.030 "name": "raid_bdev1", 00:21:39.030 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:39.030 "strip_size_kb": 0, 00:21:39.030 "state": "configuring", 00:21:39.030 "raid_level": "raid1", 00:21:39.030 "superblock": true, 00:21:39.030 "num_base_bdevs": 4, 00:21:39.030 "num_base_bdevs_discovered": 1, 00:21:39.030 "num_base_bdevs_operational": 4, 00:21:39.030 "base_bdevs_list": [ 00:21:39.030 { 00:21:39.030 "name": "pt1", 00:21:39.030 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:39.030 "is_configured": true, 00:21:39.030 "data_offset": 2048, 00:21:39.030 "data_size": 63488 00:21:39.030 }, 00:21:39.030 { 00:21:39.030 "name": null, 00:21:39.030 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:39.030 "is_configured": false, 00:21:39.030 "data_offset": 2048, 00:21:39.030 "data_size": 63488 00:21:39.030 }, 00:21:39.030 { 00:21:39.030 "name": null, 00:21:39.030 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:39.030 "is_configured": false, 00:21:39.030 "data_offset": 2048, 00:21:39.030 "data_size": 63488 00:21:39.030 }, 00:21:39.030 { 00:21:39.030 "name": null, 00:21:39.030 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:39.030 "is_configured": false, 00:21:39.030 "data_offset": 2048, 00:21:39.030 "data_size": 63488 00:21:39.030 } 00:21:39.030 ] 00:21:39.030 }' 00:21:39.030 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.030 13:40:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:21:39.596 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:39.596 13:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:39.854 [2024-07-15 13:40:19.130763] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:39.854 [2024-07-15 13:40:19.130817] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.854 [2024-07-15 13:40:19.130838] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf68940 00:21:39.854 [2024-07-15 13:40:19.130850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.854 [2024-07-15 13:40:19.131215] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.854 [2024-07-15 13:40:19.131233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:39.854 [2024-07-15 13:40:19.131297] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:39.854 [2024-07-15 13:40:19.131316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:39.854 pt2 00:21:39.854 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:40.112 [2024-07-15 13:40:19.375428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.112 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.370 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.371 "name": "raid_bdev1", 00:21:40.371 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:40.371 "strip_size_kb": 0, 00:21:40.371 "state": "configuring", 00:21:40.371 "raid_level": "raid1", 00:21:40.371 "superblock": true, 00:21:40.371 "num_base_bdevs": 4, 00:21:40.371 "num_base_bdevs_discovered": 1, 00:21:40.371 "num_base_bdevs_operational": 4, 00:21:40.371 "base_bdevs_list": [ 00:21:40.371 { 00:21:40.371 "name": "pt1", 00:21:40.371 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:40.371 "is_configured": true, 00:21:40.371 "data_offset": 2048, 00:21:40.371 "data_size": 63488 00:21:40.371 }, 00:21:40.371 { 00:21:40.371 "name": null, 00:21:40.371 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:40.371 "is_configured": false, 00:21:40.371 "data_offset": 2048, 00:21:40.371 
"data_size": 63488 00:21:40.371 }, 00:21:40.371 { 00:21:40.371 "name": null, 00:21:40.371 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:40.371 "is_configured": false, 00:21:40.371 "data_offset": 2048, 00:21:40.371 "data_size": 63488 00:21:40.371 }, 00:21:40.371 { 00:21:40.371 "name": null, 00:21:40.371 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:40.371 "is_configured": false, 00:21:40.371 "data_offset": 2048, 00:21:40.371 "data_size": 63488 00:21:40.371 } 00:21:40.371 ] 00:21:40.371 }' 00:21:40.371 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.371 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.937 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:40.937 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:40.937 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:41.195 [2024-07-15 13:40:20.482376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:41.195 [2024-07-15 13:40:20.482432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.195 [2024-07-15 13:40:20.482452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc6060 00:21:41.195 [2024-07-15 13:40:20.482465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.195 [2024-07-15 13:40:20.482823] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.195 [2024-07-15 13:40:20.482841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:41.195 [2024-07-15 13:40:20.482905] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:21:41.195 [2024-07-15 13:40:20.482924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:41.195 pt2 00:21:41.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:41.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:41.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:41.453 [2024-07-15 13:40:20.723025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:41.453 [2024-07-15 13:40:20.723070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.453 [2024-07-15 13:40:20.723091] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc88d0 00:21:41.453 [2024-07-15 13:40:20.723103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.453 [2024-07-15 13:40:20.723425] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.453 [2024-07-15 13:40:20.723442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:41.453 [2024-07-15 13:40:20.723501] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:41.453 [2024-07-15 13:40:20.723520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:41.453 pt3 00:21:41.453 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:41.453 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:41.453 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:21:41.712 [2024-07-15 13:40:20.971659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:41.712 [2024-07-15 13:40:20.971703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.712 [2024-07-15 13:40:20.971722] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc9b80 00:21:41.712 [2024-07-15 13:40:20.971734] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.712 [2024-07-15 13:40:20.972073] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.712 [2024-07-15 13:40:20.972097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:41.712 [2024-07-15 13:40:20.972157] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:41.712 [2024-07-15 13:40:20.972177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:41.712 [2024-07-15 13:40:20.972302] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc6780 00:21:41.712 [2024-07-15 13:40:20.972312] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:41.712 [2024-07-15 13:40:20.972487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcbfa0 00:21:41.712 [2024-07-15 13:40:20.972624] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc6780 00:21:41.712 [2024-07-15 13:40:20.972634] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc6780 00:21:41.712 [2024-07-15 13:40:20.972735] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:41.712 pt4 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:41.712 13:40:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.712 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.970 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.970 "name": "raid_bdev1", 00:21:41.970 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:41.970 "strip_size_kb": 0, 00:21:41.970 "state": "online", 00:21:41.970 "raid_level": "raid1", 00:21:41.970 "superblock": true, 00:21:41.970 "num_base_bdevs": 4, 00:21:41.970 "num_base_bdevs_discovered": 4, 00:21:41.970 "num_base_bdevs_operational": 4, 00:21:41.970 "base_bdevs_list": [ 00:21:41.970 { 00:21:41.970 "name": "pt1", 00:21:41.970 "uuid": "00000000-0000-0000-0000-000000000001", 
00:21:41.970 "is_configured": true, 00:21:41.970 "data_offset": 2048, 00:21:41.970 "data_size": 63488 00:21:41.970 }, 00:21:41.970 { 00:21:41.970 "name": "pt2", 00:21:41.970 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:41.970 "is_configured": true, 00:21:41.970 "data_offset": 2048, 00:21:41.970 "data_size": 63488 00:21:41.970 }, 00:21:41.970 { 00:21:41.970 "name": "pt3", 00:21:41.970 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:41.970 "is_configured": true, 00:21:41.970 "data_offset": 2048, 00:21:41.970 "data_size": 63488 00:21:41.970 }, 00:21:41.970 { 00:21:41.970 "name": "pt4", 00:21:41.970 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:41.970 "is_configured": true, 00:21:41.970 "data_offset": 2048, 00:21:41.970 "data_size": 63488 00:21:41.970 } 00:21:41.970 ] 00:21:41.970 }' 00:21:41.970 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.970 13:40:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:42.536 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:42.794 [2024-07-15 13:40:22.070894] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:42.794 "name": "raid_bdev1", 00:21:42.794 "aliases": [ 00:21:42.794 "d64b956b-2db4-44fe-aaec-ed9773519438" 00:21:42.794 ], 00:21:42.794 "product_name": "Raid Volume", 00:21:42.794 "block_size": 512, 00:21:42.794 "num_blocks": 63488, 00:21:42.794 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:42.794 "assigned_rate_limits": { 00:21:42.794 "rw_ios_per_sec": 0, 00:21:42.794 "rw_mbytes_per_sec": 0, 00:21:42.794 "r_mbytes_per_sec": 0, 00:21:42.794 "w_mbytes_per_sec": 0 00:21:42.794 }, 00:21:42.794 "claimed": false, 00:21:42.794 "zoned": false, 00:21:42.794 "supported_io_types": { 00:21:42.794 "read": true, 00:21:42.794 "write": true, 00:21:42.794 "unmap": false, 00:21:42.794 "flush": false, 00:21:42.794 "reset": true, 00:21:42.794 "nvme_admin": false, 00:21:42.794 "nvme_io": false, 00:21:42.794 "nvme_io_md": false, 00:21:42.794 "write_zeroes": true, 00:21:42.794 "zcopy": false, 00:21:42.794 "get_zone_info": false, 00:21:42.794 "zone_management": false, 00:21:42.794 "zone_append": false, 00:21:42.794 "compare": false, 00:21:42.794 "compare_and_write": false, 00:21:42.794 "abort": false, 00:21:42.794 "seek_hole": false, 00:21:42.794 "seek_data": false, 00:21:42.794 "copy": false, 00:21:42.794 "nvme_iov_md": false 00:21:42.794 }, 00:21:42.794 "memory_domains": [ 00:21:42.794 { 00:21:42.794 "dma_device_id": "system", 00:21:42.794 "dma_device_type": 1 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.794 "dma_device_type": 2 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "system", 00:21:42.794 "dma_device_type": 1 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.794 "dma_device_type": 2 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "system", 00:21:42.794 
"dma_device_type": 1 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.794 "dma_device_type": 2 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "system", 00:21:42.794 "dma_device_type": 1 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.794 "dma_device_type": 2 00:21:42.794 } 00:21:42.794 ], 00:21:42.794 "driver_specific": { 00:21:42.794 "raid": { 00:21:42.794 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:42.794 "strip_size_kb": 0, 00:21:42.794 "state": "online", 00:21:42.794 "raid_level": "raid1", 00:21:42.794 "superblock": true, 00:21:42.794 "num_base_bdevs": 4, 00:21:42.794 "num_base_bdevs_discovered": 4, 00:21:42.794 "num_base_bdevs_operational": 4, 00:21:42.794 "base_bdevs_list": [ 00:21:42.794 { 00:21:42.794 "name": "pt1", 00:21:42.794 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:42.794 "is_configured": true, 00:21:42.794 "data_offset": 2048, 00:21:42.794 "data_size": 63488 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "name": "pt2", 00:21:42.794 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:42.794 "is_configured": true, 00:21:42.794 "data_offset": 2048, 00:21:42.794 "data_size": 63488 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "name": "pt3", 00:21:42.794 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:42.794 "is_configured": true, 00:21:42.794 "data_offset": 2048, 00:21:42.794 "data_size": 63488 00:21:42.794 }, 00:21:42.794 { 00:21:42.794 "name": "pt4", 00:21:42.794 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:42.794 "is_configured": true, 00:21:42.794 "data_offset": 2048, 00:21:42.794 "data_size": 63488 00:21:42.794 } 00:21:42.794 ] 00:21:42.794 } 00:21:42.794 } 00:21:42.794 }' 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:42.794 pt2 00:21:42.794 pt3 00:21:42.794 pt4' 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:42.794 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:43.053 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:43.053 "name": "pt1", 00:21:43.053 "aliases": [ 00:21:43.053 "00000000-0000-0000-0000-000000000001" 00:21:43.053 ], 00:21:43.053 "product_name": "passthru", 00:21:43.053 "block_size": 512, 00:21:43.053 "num_blocks": 65536, 00:21:43.053 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:43.053 "assigned_rate_limits": { 00:21:43.053 "rw_ios_per_sec": 0, 00:21:43.053 "rw_mbytes_per_sec": 0, 00:21:43.053 "r_mbytes_per_sec": 0, 00:21:43.053 "w_mbytes_per_sec": 0 00:21:43.053 }, 00:21:43.053 "claimed": true, 00:21:43.053 "claim_type": "exclusive_write", 00:21:43.053 "zoned": false, 00:21:43.053 "supported_io_types": { 00:21:43.053 "read": true, 00:21:43.053 "write": true, 00:21:43.053 "unmap": true, 00:21:43.053 "flush": true, 00:21:43.053 "reset": true, 00:21:43.053 "nvme_admin": false, 00:21:43.053 "nvme_io": false, 00:21:43.053 "nvme_io_md": false, 00:21:43.053 "write_zeroes": true, 00:21:43.053 "zcopy": true, 00:21:43.053 "get_zone_info": false, 00:21:43.053 "zone_management": false, 00:21:43.053 "zone_append": false, 00:21:43.053 "compare": false, 00:21:43.053 "compare_and_write": false, 00:21:43.053 "abort": true, 00:21:43.053 "seek_hole": false, 00:21:43.053 "seek_data": false, 00:21:43.053 "copy": true, 00:21:43.053 "nvme_iov_md": false 00:21:43.053 }, 00:21:43.053 "memory_domains": [ 00:21:43.053 { 00:21:43.053 "dma_device_id": "system", 00:21:43.053 
"dma_device_type": 1 00:21:43.053 }, 00:21:43.053 { 00:21:43.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.053 "dma_device_type": 2 00:21:43.053 } 00:21:43.053 ], 00:21:43.053 "driver_specific": { 00:21:43.053 "passthru": { 00:21:43.053 "name": "pt1", 00:21:43.053 "base_bdev_name": "malloc1" 00:21:43.053 } 00:21:43.053 } 00:21:43.053 }' 00:21:43.053 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.053 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.053 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:43.053 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:43.311 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:43.569 "name": "pt2", 00:21:43.569 "aliases": [ 00:21:43.569 "00000000-0000-0000-0000-000000000002" 00:21:43.569 ], 00:21:43.569 "product_name": "passthru", 00:21:43.569 "block_size": 512, 00:21:43.569 "num_blocks": 65536, 00:21:43.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:43.569 "assigned_rate_limits": { 00:21:43.569 "rw_ios_per_sec": 0, 00:21:43.569 "rw_mbytes_per_sec": 0, 00:21:43.569 "r_mbytes_per_sec": 0, 00:21:43.569 "w_mbytes_per_sec": 0 00:21:43.569 }, 00:21:43.569 "claimed": true, 00:21:43.569 "claim_type": "exclusive_write", 00:21:43.569 "zoned": false, 00:21:43.569 "supported_io_types": { 00:21:43.569 "read": true, 00:21:43.569 "write": true, 00:21:43.569 "unmap": true, 00:21:43.569 "flush": true, 00:21:43.569 "reset": true, 00:21:43.569 "nvme_admin": false, 00:21:43.569 "nvme_io": false, 00:21:43.569 "nvme_io_md": false, 00:21:43.569 "write_zeroes": true, 00:21:43.569 "zcopy": true, 00:21:43.569 "get_zone_info": false, 00:21:43.569 "zone_management": false, 00:21:43.569 "zone_append": false, 00:21:43.569 "compare": false, 00:21:43.569 "compare_and_write": false, 00:21:43.569 "abort": true, 00:21:43.569 "seek_hole": false, 00:21:43.569 "seek_data": false, 00:21:43.569 "copy": true, 00:21:43.569 "nvme_iov_md": false 00:21:43.569 }, 00:21:43.569 "memory_domains": [ 00:21:43.569 { 00:21:43.569 "dma_device_id": "system", 00:21:43.569 "dma_device_type": 1 00:21:43.569 }, 00:21:43.569 { 00:21:43.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.569 "dma_device_type": 2 00:21:43.569 } 00:21:43.569 ], 00:21:43.569 "driver_specific": { 00:21:43.569 "passthru": { 00:21:43.569 "name": "pt2", 00:21:43.569 "base_bdev_name": "malloc2" 00:21:43.569 } 00:21:43.569 } 00:21:43.569 }' 00:21:43.569 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.828 13:40:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:43.828 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.085 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.085 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:44.085 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.085 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:44.085 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.342 "name": "pt3", 00:21:44.342 "aliases": [ 00:21:44.342 "00000000-0000-0000-0000-000000000003" 00:21:44.342 ], 00:21:44.342 "product_name": "passthru", 00:21:44.342 "block_size": 512, 00:21:44.342 "num_blocks": 65536, 00:21:44.342 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:44.342 "assigned_rate_limits": { 00:21:44.342 "rw_ios_per_sec": 0, 00:21:44.342 "rw_mbytes_per_sec": 0, 00:21:44.342 "r_mbytes_per_sec": 0, 00:21:44.342 "w_mbytes_per_sec": 0 00:21:44.342 }, 00:21:44.342 "claimed": true, 00:21:44.342 
"claim_type": "exclusive_write", 00:21:44.342 "zoned": false, 00:21:44.342 "supported_io_types": { 00:21:44.342 "read": true, 00:21:44.342 "write": true, 00:21:44.342 "unmap": true, 00:21:44.342 "flush": true, 00:21:44.342 "reset": true, 00:21:44.342 "nvme_admin": false, 00:21:44.342 "nvme_io": false, 00:21:44.342 "nvme_io_md": false, 00:21:44.342 "write_zeroes": true, 00:21:44.342 "zcopy": true, 00:21:44.342 "get_zone_info": false, 00:21:44.342 "zone_management": false, 00:21:44.342 "zone_append": false, 00:21:44.342 "compare": false, 00:21:44.342 "compare_and_write": false, 00:21:44.342 "abort": true, 00:21:44.342 "seek_hole": false, 00:21:44.342 "seek_data": false, 00:21:44.342 "copy": true, 00:21:44.342 "nvme_iov_md": false 00:21:44.342 }, 00:21:44.342 "memory_domains": [ 00:21:44.342 { 00:21:44.342 "dma_device_id": "system", 00:21:44.342 "dma_device_type": 1 00:21:44.342 }, 00:21:44.342 { 00:21:44.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.342 "dma_device_type": 2 00:21:44.342 } 00:21:44.342 ], 00:21:44.342 "driver_specific": { 00:21:44.342 "passthru": { 00:21:44.342 "name": "pt3", 00:21:44.342 "base_bdev_name": "malloc3" 00:21:44.342 } 00:21:44.342 } 00:21:44.342 }' 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:44.342 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:44.600 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.858 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.858 "name": "pt4", 00:21:44.858 "aliases": [ 00:21:44.858 "00000000-0000-0000-0000-000000000004" 00:21:44.858 ], 00:21:44.858 "product_name": "passthru", 00:21:44.858 "block_size": 512, 00:21:44.858 "num_blocks": 65536, 00:21:44.858 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:44.858 "assigned_rate_limits": { 00:21:44.858 "rw_ios_per_sec": 0, 00:21:44.858 "rw_mbytes_per_sec": 0, 00:21:44.858 "r_mbytes_per_sec": 0, 00:21:44.858 "w_mbytes_per_sec": 0 00:21:44.858 }, 00:21:44.858 "claimed": true, 00:21:44.858 "claim_type": "exclusive_write", 00:21:44.858 "zoned": false, 00:21:44.858 "supported_io_types": { 00:21:44.858 "read": true, 00:21:44.858 "write": true, 00:21:44.858 "unmap": true, 00:21:44.858 "flush": true, 00:21:44.858 "reset": true, 00:21:44.858 "nvme_admin": false, 00:21:44.858 "nvme_io": false, 00:21:44.858 "nvme_io_md": false, 00:21:44.858 "write_zeroes": true, 00:21:44.858 "zcopy": true, 00:21:44.858 "get_zone_info": false, 00:21:44.858 "zone_management": false, 00:21:44.858 "zone_append": false, 00:21:44.858 "compare": false, 00:21:44.858 
"compare_and_write": false, 00:21:44.858 "abort": true, 00:21:44.858 "seek_hole": false, 00:21:44.858 "seek_data": false, 00:21:44.858 "copy": true, 00:21:44.858 "nvme_iov_md": false 00:21:44.858 }, 00:21:44.858 "memory_domains": [ 00:21:44.858 { 00:21:44.858 "dma_device_id": "system", 00:21:44.858 "dma_device_type": 1 00:21:44.858 }, 00:21:44.858 { 00:21:44.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.858 "dma_device_type": 2 00:21:44.858 } 00:21:44.858 ], 00:21:44.858 "driver_specific": { 00:21:44.858 "passthru": { 00:21:44.858 "name": "pt4", 00:21:44.858 "base_bdev_name": "malloc4" 00:21:44.858 } 00:21:44.858 } 00:21:44.858 }' 00:21:44.858 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.858 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.858 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:44.858 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:45.116 13:40:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:45.374 [2024-07-15 13:40:24.738096] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:45.374 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d64b956b-2db4-44fe-aaec-ed9773519438 '!=' d64b956b-2db4-44fe-aaec-ed9773519438 ']' 00:21:45.374 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:45.374 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:45.374 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:45.374 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:45.632 [2024-07-15 13:40:24.986489] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.632 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.890 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.890 "name": "raid_bdev1", 00:21:45.890 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:45.890 "strip_size_kb": 0, 00:21:45.890 "state": "online", 00:21:45.890 "raid_level": "raid1", 00:21:45.890 "superblock": true, 00:21:45.890 "num_base_bdevs": 4, 00:21:45.890 "num_base_bdevs_discovered": 3, 00:21:45.890 "num_base_bdevs_operational": 3, 00:21:45.890 "base_bdevs_list": [ 00:21:45.890 { 00:21:45.890 "name": null, 00:21:45.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.890 "is_configured": false, 00:21:45.890 "data_offset": 2048, 00:21:45.890 "data_size": 63488 00:21:45.890 }, 00:21:45.890 { 00:21:45.890 "name": "pt2", 00:21:45.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:45.890 "is_configured": true, 00:21:45.890 "data_offset": 2048, 00:21:45.890 "data_size": 63488 00:21:45.890 }, 00:21:45.890 { 00:21:45.890 "name": "pt3", 00:21:45.890 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:45.890 "is_configured": true, 00:21:45.890 "data_offset": 2048, 00:21:45.890 "data_size": 63488 00:21:45.890 }, 00:21:45.890 { 00:21:45.890 "name": "pt4", 00:21:45.890 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:45.890 "is_configured": true, 00:21:45.890 "data_offset": 2048, 00:21:45.890 "data_size": 63488 00:21:45.890 } 00:21:45.890 ] 00:21:45.890 }' 00:21:45.890 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.890 13:40:25 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:46.456 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:46.714 [2024-07-15 13:40:26.009155] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:46.714 [2024-07-15 13:40:26.009184] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:46.714 [2024-07-15 13:40:26.009234] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:46.714 [2024-07-15 13:40:26.009295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:46.714 [2024-07-15 13:40:26.009306] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc6780 name raid_bdev1, state offline 00:21:46.714 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.714 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 
-- # (( i < num_base_bdevs )) 00:21:46.972 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:47.231 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:47.231 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:47.231 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:47.488 [2024-07-15 13:40:26.887480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:47.488 [2024-07-15 13:40:26.887529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.488 [2024-07-15 13:40:26.887548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf69700 00:21:47.488 [2024-07-15 13:40:26.887560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.488 [2024-07-15 13:40:26.889192] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.488 [2024-07-15 13:40:26.889219] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: pt2 00:21:47.488 [2024-07-15 13:40:26.889282] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:47.488 [2024-07-15 13:40:26.889309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:47.488 pt2 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.488 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.746 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.746 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.746 "name": "raid_bdev1", 00:21:47.746 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:47.746 "strip_size_kb": 0, 00:21:47.746 "state": "configuring", 00:21:47.746 
"raid_level": "raid1", 00:21:47.746 "superblock": true, 00:21:47.746 "num_base_bdevs": 4, 00:21:47.746 "num_base_bdevs_discovered": 1, 00:21:47.746 "num_base_bdevs_operational": 3, 00:21:47.746 "base_bdevs_list": [ 00:21:47.746 { 00:21:47.746 "name": null, 00:21:47.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.746 "is_configured": false, 00:21:47.746 "data_offset": 2048, 00:21:47.746 "data_size": 63488 00:21:47.746 }, 00:21:47.746 { 00:21:47.746 "name": "pt2", 00:21:47.746 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:47.746 "is_configured": true, 00:21:47.746 "data_offset": 2048, 00:21:47.746 "data_size": 63488 00:21:47.746 }, 00:21:47.746 { 00:21:47.746 "name": null, 00:21:47.746 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:47.746 "is_configured": false, 00:21:47.746 "data_offset": 2048, 00:21:47.746 "data_size": 63488 00:21:47.746 }, 00:21:47.746 { 00:21:47.746 "name": null, 00:21:47.746 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:47.746 "is_configured": false, 00:21:47.746 "data_offset": 2048, 00:21:47.746 "data_size": 63488 00:21:47.746 } 00:21:47.746 ] 00:21:47.746 }' 00:21:47.746 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.746 13:40:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.310 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:48.310 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:48.310 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:48.567 [2024-07-15 13:40:27.878120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:48.567 [2024-07-15 13:40:27.878166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:21:48.567 [2024-07-15 13:40:27.878187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcfa10 00:21:48.567 [2024-07-15 13:40:27.878200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:48.567 [2024-07-15 13:40:27.878540] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:48.567 [2024-07-15 13:40:27.878557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:48.567 [2024-07-15 13:40:27.878616] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:48.567 [2024-07-15 13:40:27.878636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:48.567 pt3 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:21:48.567 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.824 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.824 "name": "raid_bdev1", 00:21:48.824 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:48.824 "strip_size_kb": 0, 00:21:48.824 "state": "configuring", 00:21:48.824 "raid_level": "raid1", 00:21:48.824 "superblock": true, 00:21:48.824 "num_base_bdevs": 4, 00:21:48.824 "num_base_bdevs_discovered": 2, 00:21:48.824 "num_base_bdevs_operational": 3, 00:21:48.824 "base_bdevs_list": [ 00:21:48.824 { 00:21:48.824 "name": null, 00:21:48.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.824 "is_configured": false, 00:21:48.824 "data_offset": 2048, 00:21:48.824 "data_size": 63488 00:21:48.824 }, 00:21:48.824 { 00:21:48.824 "name": "pt2", 00:21:48.824 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:48.824 "is_configured": true, 00:21:48.824 "data_offset": 2048, 00:21:48.824 "data_size": 63488 00:21:48.824 }, 00:21:48.824 { 00:21:48.824 "name": "pt3", 00:21:48.824 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:48.824 "is_configured": true, 00:21:48.824 "data_offset": 2048, 00:21:48.824 "data_size": 63488 00:21:48.824 }, 00:21:48.824 { 00:21:48.824 "name": null, 00:21:48.824 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:48.824 "is_configured": false, 00:21:48.824 "data_offset": 2048, 00:21:48.824 "data_size": 63488 00:21:48.824 } 00:21:48.824 ] 00:21:48.824 }' 00:21:48.824 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.824 13:40:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.389 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:49.389 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( 
i < num_base_bdevs - 1 )) 00:21:49.389 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:21:49.389 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:49.646 [2024-07-15 13:40:28.989046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:49.647 [2024-07-15 13:40:28.989104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.647 [2024-07-15 13:40:28.989123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf72520 00:21:49.647 [2024-07-15 13:40:28.989135] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.647 [2024-07-15 13:40:28.989474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.647 [2024-07-15 13:40:28.989492] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:49.647 [2024-07-15 13:40:28.989551] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:49.647 [2024-07-15 13:40:28.989569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:49.647 [2024-07-15 13:40:28.989683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc6ea0 00:21:49.647 [2024-07-15 13:40:28.989694] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:49.647 [2024-07-15 13:40:28.989859] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcb600 00:21:49.647 [2024-07-15 13:40:28.990003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc6ea0 00:21:49.647 [2024-07-15 13:40:28.990014] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc6ea0 00:21:49.647 [2024-07-15 
13:40:28.990111] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:49.647 pt4 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.647 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.904 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.904 "name": "raid_bdev1", 00:21:49.904 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:49.904 "strip_size_kb": 0, 00:21:49.904 "state": "online", 00:21:49.904 "raid_level": "raid1", 00:21:49.904 "superblock": true, 00:21:49.904 "num_base_bdevs": 4, 00:21:49.904 "num_base_bdevs_discovered": 3, 00:21:49.904 "num_base_bdevs_operational": 3, 00:21:49.904 
"base_bdevs_list": [ 00:21:49.904 { 00:21:49.904 "name": null, 00:21:49.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.904 "is_configured": false, 00:21:49.904 "data_offset": 2048, 00:21:49.904 "data_size": 63488 00:21:49.904 }, 00:21:49.904 { 00:21:49.904 "name": "pt2", 00:21:49.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:49.904 "is_configured": true, 00:21:49.904 "data_offset": 2048, 00:21:49.904 "data_size": 63488 00:21:49.904 }, 00:21:49.904 { 00:21:49.904 "name": "pt3", 00:21:49.904 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:49.904 "is_configured": true, 00:21:49.904 "data_offset": 2048, 00:21:49.904 "data_size": 63488 00:21:49.904 }, 00:21:49.904 { 00:21:49.904 "name": "pt4", 00:21:49.904 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:49.904 "is_configured": true, 00:21:49.904 "data_offset": 2048, 00:21:49.904 "data_size": 63488 00:21:49.904 } 00:21:49.904 ] 00:21:49.904 }' 00:21:49.904 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.904 13:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.510 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:50.768 [2024-07-15 13:40:30.079918] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:50.768 [2024-07-15 13:40:30.079959] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:50.768 [2024-07-15 13:40:30.080023] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:50.768 [2024-07-15 13:40:30.080093] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:50.768 [2024-07-15 13:40:30.080106] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc6ea0 name raid_bdev1, state 
offline 00:21:50.768 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.768 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:51.026 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:51.026 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:51.026 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:21:51.026 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:21:51.026 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:51.284 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:51.541 [2024-07-15 13:40:30.761696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:51.541 [2024-07-15 13:40:30.761742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.541 [2024-07-15 13:40:30.761760] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf72520 00:21:51.541 [2024-07-15 13:40:30.761772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.541 [2024-07-15 13:40:30.763390] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.541 [2024-07-15 13:40:30.763417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:51.541 [2024-07-15 13:40:30.763479] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:51.541 [2024-07-15 
13:40:30.763506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:51.541 [2024-07-15 13:40:30.763607] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:51.541 [2024-07-15 13:40:30.763620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:51.541 [2024-07-15 13:40:30.763634] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc6060 name raid_bdev1, state configuring 00:21:51.541 [2024-07-15 13:40:30.763657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:51.541 [2024-07-15 13:40:30.763732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:51.541 pt1 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.541 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.798 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.798 "name": "raid_bdev1", 00:21:51.798 "uuid": "d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:51.798 "strip_size_kb": 0, 00:21:51.798 "state": "configuring", 00:21:51.798 "raid_level": "raid1", 00:21:51.798 "superblock": true, 00:21:51.798 "num_base_bdevs": 4, 00:21:51.798 "num_base_bdevs_discovered": 2, 00:21:51.798 "num_base_bdevs_operational": 3, 00:21:51.798 "base_bdevs_list": [ 00:21:51.798 { 00:21:51.798 "name": null, 00:21:51.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.798 "is_configured": false, 00:21:51.798 "data_offset": 2048, 00:21:51.798 "data_size": 63488 00:21:51.798 }, 00:21:51.798 { 00:21:51.798 "name": "pt2", 00:21:51.798 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:51.798 "is_configured": true, 00:21:51.798 "data_offset": 2048, 00:21:51.798 "data_size": 63488 00:21:51.798 }, 00:21:51.798 { 00:21:51.798 "name": "pt3", 00:21:51.798 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:51.798 "is_configured": true, 00:21:51.798 "data_offset": 2048, 00:21:51.798 "data_size": 63488 00:21:51.798 }, 00:21:51.798 { 00:21:51.798 "name": null, 00:21:51.798 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:51.798 "is_configured": false, 00:21:51.798 "data_offset": 2048, 00:21:51.798 "data_size": 63488 00:21:51.798 } 00:21:51.798 ] 00:21:51.798 }' 00:21:51.798 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.798 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.364 13:40:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:52.364 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:21:52.622 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:21:52.622 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:52.880 [2024-07-15 13:40:32.089244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:52.880 [2024-07-15 13:40:32.089307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.880 [2024-07-15 13:40:32.089329] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc6310 00:21:52.880 [2024-07-15 13:40:32.089349] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.880 [2024-07-15 13:40:32.089724] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.880 [2024-07-15 13:40:32.089743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:52.880 [2024-07-15 13:40:32.089809] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:52.880 [2024-07-15 13:40:32.089831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:52.880 [2024-07-15 13:40:32.089965] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc9b40 00:21:52.880 [2024-07-15 13:40:32.089977] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:52.880 [2024-07-15 13:40:32.090153] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf69990 00:21:52.880 [2024-07-15 
13:40:32.090286] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc9b40 00:21:52.880 [2024-07-15 13:40:32.090296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc9b40 00:21:52.880 [2024-07-15 13:40:32.090396] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.880 pt4 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.880 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.137 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.138 "name": "raid_bdev1", 00:21:53.138 "uuid": 
"d64b956b-2db4-44fe-aaec-ed9773519438", 00:21:53.138 "strip_size_kb": 0, 00:21:53.138 "state": "online", 00:21:53.138 "raid_level": "raid1", 00:21:53.138 "superblock": true, 00:21:53.138 "num_base_bdevs": 4, 00:21:53.138 "num_base_bdevs_discovered": 3, 00:21:53.138 "num_base_bdevs_operational": 3, 00:21:53.138 "base_bdevs_list": [ 00:21:53.138 { 00:21:53.138 "name": null, 00:21:53.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.138 "is_configured": false, 00:21:53.138 "data_offset": 2048, 00:21:53.138 "data_size": 63488 00:21:53.138 }, 00:21:53.138 { 00:21:53.138 "name": "pt2", 00:21:53.138 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.138 "is_configured": true, 00:21:53.138 "data_offset": 2048, 00:21:53.138 "data_size": 63488 00:21:53.138 }, 00:21:53.138 { 00:21:53.138 "name": "pt3", 00:21:53.138 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:53.138 "is_configured": true, 00:21:53.138 "data_offset": 2048, 00:21:53.138 "data_size": 63488 00:21:53.138 }, 00:21:53.138 { 00:21:53.138 "name": "pt4", 00:21:53.138 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:53.138 "is_configured": true, 00:21:53.138 "data_offset": 2048, 00:21:53.138 "data_size": 63488 00:21:53.138 } 00:21:53.138 ] 00:21:53.138 }' 00:21:53.138 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.138 13:40:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:53.703 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:53.703 13:40:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:53.961 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:53.961 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:53.961 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:54.218 [2024-07-15 13:40:33.425097] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' d64b956b-2db4-44fe-aaec-ed9773519438 '!=' d64b956b-2db4-44fe-aaec-ed9773519438 ']' 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2168305 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2168305 ']' 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2168305 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2168305 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2168305' 00:21:54.218 killing process with pid 2168305 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2168305 00:21:54.218 [2024-07-15 13:40:33.495847] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:54.218 [2024-07-15 13:40:33.495915] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:54.218 [2024-07-15 13:40:33.496000] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:21:54.218 [2024-07-15 13:40:33.496014] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc9b40 name raid_bdev1, state offline 00:21:54.218 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2168305 00:21:54.218 [2024-07-15 13:40:33.538277] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:54.476 13:40:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:54.476 00:21:54.476 real 0m25.118s 00:21:54.476 user 0m45.897s 00:21:54.476 sys 0m4.521s 00:21:54.476 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:54.476 13:40:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.476 ************************************ 00:21:54.476 END TEST raid_superblock_test 00:21:54.476 ************************************ 00:21:54.476 13:40:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:54.476 13:40:33 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:21:54.476 13:40:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:54.476 13:40:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:54.476 13:40:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:54.476 ************************************ 00:21:54.476 START TEST raid_read_error_test 00:21:54.476 ************************************ 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:54.476 
13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:54.476 13:40:33 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qDouCfSpGL 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2172003 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2172003 /var/tmp/spdk-raid.sock 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2172003 ']' 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:54.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:54.476 13:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.734 [2024-07-15 13:40:33.915015] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:21:54.734 [2024-07-15 13:40:33.915078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2172003 ] 00:21:54.734 [2024-07-15 13:40:34.035529] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:54.734 [2024-07-15 13:40:34.141811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:54.992 [2024-07-15 13:40:34.204641] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:54.992 [2024-07-15 13:40:34.204670] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:55.557 13:40:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:55.557 13:40:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:55.557 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:55.557 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:55.814 BaseBdev1_malloc 00:21:55.814 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:56.072 true 00:21:56.072 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:56.340 [2024-07-15 13:40:35.530356] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:56.340 [2024-07-15 13:40:35.530402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.340 [2024-07-15 13:40:35.530425] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19530d0 00:21:56.340 [2024-07-15 13:40:35.530437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.340 [2024-07-15 13:40:35.532348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.340 [2024-07-15 13:40:35.532378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:56.340 BaseBdev1 00:21:56.340 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:56.340 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:56.597 BaseBdev2_malloc 00:21:56.597 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:56.855 true 00:21:56.855 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:56.855 [2024-07-15 13:40:36.266134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:56.855 [2024-07-15 13:40:36.266177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.855 [2024-07-15 13:40:36.266198] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1957910 00:21:56.855 [2024-07-15 13:40:36.266211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.855 [2024-07-15 13:40:36.267770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.855 [2024-07-15 13:40:36.267798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:56.855 BaseBdev2 00:21:57.112 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:57.112 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:57.112 BaseBdev3_malloc 00:21:57.112 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:57.368 true 00:21:57.368 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:57.626 [2024-07-15 13:40:37.004632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:57.626 [2024-07-15 13:40:37.004678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.626 [2024-07-15 13:40:37.004699] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1959bd0 00:21:57.626 [2024-07-15 13:40:37.004711] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.626 [2024-07-15 13:40:37.006326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.626 [2024-07-15 13:40:37.006354] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:57.626 
BaseBdev3 00:21:57.626 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:57.626 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:57.883 BaseBdev4_malloc 00:21:57.883 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:58.140 true 00:21:58.140 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:58.396 [2024-07-15 13:40:37.728326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:58.396 [2024-07-15 13:40:37.728369] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.396 [2024-07-15 13:40:37.728391] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195aaa0 00:21:58.396 [2024-07-15 13:40:37.728403] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.396 [2024-07-15 13:40:37.729978] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.396 [2024-07-15 13:40:37.730005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:58.396 BaseBdev4 00:21:58.396 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:58.652 [2024-07-15 13:40:37.969003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:58.652 [2024-07-15 
13:40:37.970332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:58.652 [2024-07-15 13:40:37.970401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:58.652 [2024-07-15 13:40:37.970462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:58.652 [2024-07-15 13:40:37.970697] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1954c20 00:21:58.652 [2024-07-15 13:40:37.970709] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:58.652 [2024-07-15 13:40:37.970902] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a9260 00:21:58.652 [2024-07-15 13:40:37.971071] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1954c20 00:21:58.652 [2024-07-15 13:40:37.971082] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1954c20 00:21:58.652 [2024-07-15 13:40:37.971186] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.652 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.920 13:40:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.920 "name": "raid_bdev1", 00:21:58.920 "uuid": "5f1d5e62-7d04-4592-9ed4-be01fa96535e", 00:21:58.920 "strip_size_kb": 0, 00:21:58.920 "state": "online", 00:21:58.920 "raid_level": "raid1", 00:21:58.920 "superblock": true, 00:21:58.920 "num_base_bdevs": 4, 00:21:58.920 "num_base_bdevs_discovered": 4, 00:21:58.920 "num_base_bdevs_operational": 4, 00:21:58.920 "base_bdevs_list": [ 00:21:58.920 { 00:21:58.920 "name": "BaseBdev1", 00:21:58.920 "uuid": "e4aa6fcb-8b09-521b-bc92-b6895475149a", 00:21:58.920 "is_configured": true, 00:21:58.920 "data_offset": 2048, 00:21:58.920 "data_size": 63488 00:21:58.920 }, 00:21:58.920 { 00:21:58.920 "name": "BaseBdev2", 00:21:58.920 "uuid": "336cc272-c606-5148-a8d2-47b30a0b9947", 00:21:58.920 "is_configured": true, 00:21:58.920 "data_offset": 2048, 00:21:58.920 "data_size": 63488 00:21:58.920 }, 00:21:58.920 { 00:21:58.920 "name": "BaseBdev3", 00:21:58.920 "uuid": "0fe2e192-7b26-5145-bf7f-d601355ef745", 00:21:58.920 "is_configured": true, 00:21:58.920 "data_offset": 2048, 00:21:58.920 "data_size": 63488 00:21:58.920 }, 00:21:58.920 { 00:21:58.920 "name": "BaseBdev4", 00:21:58.920 "uuid": "e3286181-655c-52a4-a239-b04df8f398c7", 00:21:58.920 "is_configured": true, 00:21:58.920 "data_offset": 2048, 00:21:58.920 "data_size": 63488 00:21:58.920 } 00:21:58.920 ] 00:21:58.920 }' 00:21:58.920 13:40:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.920 13:40:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.481 13:40:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:59.481 13:40:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:59.738 [2024-07-15 13:40:38.947872] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a8c60 00:22:00.670 13:40:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:00.670 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.927 "name": "raid_bdev1", 00:22:00.927 "uuid": "5f1d5e62-7d04-4592-9ed4-be01fa96535e", 00:22:00.927 "strip_size_kb": 0, 00:22:00.927 "state": "online", 00:22:00.927 "raid_level": "raid1", 00:22:00.927 "superblock": true, 00:22:00.927 "num_base_bdevs": 4, 00:22:00.927 "num_base_bdevs_discovered": 4, 00:22:00.927 "num_base_bdevs_operational": 4, 00:22:00.927 "base_bdevs_list": [ 00:22:00.927 { 00:22:00.927 "name": "BaseBdev1", 00:22:00.927 "uuid": "e4aa6fcb-8b09-521b-bc92-b6895475149a", 00:22:00.927 "is_configured": true, 00:22:00.927 "data_offset": 2048, 00:22:00.927 "data_size": 63488 00:22:00.927 }, 00:22:00.927 { 00:22:00.927 "name": "BaseBdev2", 00:22:00.927 "uuid": "336cc272-c606-5148-a8d2-47b30a0b9947", 00:22:00.927 "is_configured": true, 00:22:00.927 "data_offset": 2048, 00:22:00.927 "data_size": 63488 00:22:00.927 }, 00:22:00.927 { 00:22:00.927 "name": "BaseBdev3", 00:22:00.927 "uuid": "0fe2e192-7b26-5145-bf7f-d601355ef745", 00:22:00.927 "is_configured": true, 00:22:00.927 "data_offset": 2048, 00:22:00.927 "data_size": 63488 00:22:00.927 }, 00:22:00.927 { 00:22:00.927 "name": "BaseBdev4", 00:22:00.927 "uuid": "e3286181-655c-52a4-a239-b04df8f398c7", 00:22:00.927 "is_configured": 
true, 00:22:00.927 "data_offset": 2048, 00:22:00.927 "data_size": 63488 00:22:00.927 } 00:22:00.927 ] 00:22:00.927 }' 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.927 13:40:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.859 13:40:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:01.859 [2024-07-15 13:40:41.154516] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:01.859 [2024-07-15 13:40:41.154554] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:01.859 [2024-07-15 13:40:41.157728] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:01.859 [2024-07-15 13:40:41.157766] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:01.859 [2024-07-15 13:40:41.157886] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:01.859 [2024-07-15 13:40:41.157898] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1954c20 name raid_bdev1, state offline 00:22:01.859 0 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2172003 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2172003 ']' 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2172003 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2172003 00:22:01.859 13:40:41 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2172003' 00:22:01.859 killing process with pid 2172003 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2172003 00:22:01.859 [2024-07-15 13:40:41.223739] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:01.859 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2172003 00:22:01.859 [2024-07-15 13:40:41.255443] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qDouCfSpGL 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:02.116 00:22:02.116 real 0m7.649s 00:22:02.116 user 0m12.214s 00:22:02.116 sys 0m1.356s 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:02.116 13:40:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.116 ************************************ 00:22:02.116 END TEST 
raid_read_error_test 00:22:02.116 ************************************ 00:22:02.374 13:40:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:02.374 13:40:41 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:02.374 13:40:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:02.374 13:40:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:02.374 13:40:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:02.374 ************************************ 00:22:02.374 START TEST raid_write_error_test 00:22:02.374 ************************************ 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GRIDqfTn6K 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2173148 00:22:02.374 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2173148 /var/tmp/spdk-raid.sock 00:22:02.375 
13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2173148 ']' 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:02.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:02.375 13:40:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.375 [2024-07-15 13:40:41.667220] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:22:02.375 [2024-07-15 13:40:41.667290] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2173148 ] 00:22:02.375 [2024-07-15 13:40:41.795689] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:02.632 [2024-07-15 13:40:41.903848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:02.632 [2024-07-15 13:40:41.960684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:02.632 [2024-07-15 13:40:41.960715] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:03.235 13:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:03.235 13:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:03.235 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:03.235 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:03.492 BaseBdev1_malloc 00:22:03.492 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:03.750 true 00:22:03.750 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:04.009 [2024-07-15 13:40:43.189299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:04.009 [2024-07-15 13:40:43.189343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:04.009 [2024-07-15 13:40:43.189365] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd5c0d0 00:22:04.009 [2024-07-15 13:40:43.189378] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.009 [2024-07-15 13:40:43.191165] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.009 [2024-07-15 13:40:43.191195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:04.009 BaseBdev1 00:22:04.009 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:04.009 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:04.009 BaseBdev2_malloc 00:22:04.009 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:04.265 true 00:22:04.265 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:04.523 [2024-07-15 13:40:43.707177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:04.523 [2024-07-15 13:40:43.707217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.523 [2024-07-15 13:40:43.707238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd60910 00:22:04.523 [2024-07-15 13:40:43.707251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.523 [2024-07-15 13:40:43.708653] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.523 [2024-07-15 13:40:43.708681] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:04.523 BaseBdev2 00:22:04.523 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:04.523 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:04.523 BaseBdev3_malloc 00:22:04.523 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:04.781 true 00:22:04.781 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:05.038 [2024-07-15 13:40:44.233241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:05.038 [2024-07-15 13:40:44.233284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.038 [2024-07-15 13:40:44.233304] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd62bd0 00:22:05.038 [2024-07-15 13:40:44.233317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.039 [2024-07-15 13:40:44.234730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.039 [2024-07-15 13:40:44.234757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:05.039 BaseBdev3 00:22:05.039 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:05.039 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:05.039 BaseBdev4_malloc 00:22:05.039 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:05.295 true 00:22:05.295 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:05.552 [2024-07-15 13:40:44.759118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:05.552 [2024-07-15 13:40:44.759160] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:05.552 [2024-07-15 13:40:44.759179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd63aa0 00:22:05.552 [2024-07-15 13:40:44.759191] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:05.552 [2024-07-15 13:40:44.760588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:05.552 [2024-07-15 13:40:44.760613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:05.552 BaseBdev4 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:05.552 [2024-07-15 13:40:44.935612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:05.552 [2024-07-15 13:40:44.936805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:05.552 [2024-07-15 13:40:44.936868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:05.552 [2024-07-15 13:40:44.936937] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:05.552 [2024-07-15 13:40:44.937168] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd5dc20 00:22:05.552 [2024-07-15 13:40:44.937180] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:05.552 [2024-07-15 13:40:44.937351] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb2260 00:22:05.552 [2024-07-15 13:40:44.937499] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd5dc20 00:22:05.552 [2024-07-15 13:40:44.937509] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd5dc20 00:22:05.552 [2024-07-15 13:40:44.937608] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.552 13:40:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.552 13:40:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:05.810 13:40:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.810 "name": "raid_bdev1", 00:22:05.810 "uuid": "e4b82328-2c27-4a34-b167-2b7633f7fb1f", 00:22:05.810 "strip_size_kb": 0, 00:22:05.810 "state": "online", 00:22:05.810 "raid_level": "raid1", 00:22:05.810 "superblock": true, 00:22:05.810 "num_base_bdevs": 4, 00:22:05.810 "num_base_bdevs_discovered": 4, 00:22:05.810 "num_base_bdevs_operational": 4, 00:22:05.810 "base_bdevs_list": [ 00:22:05.810 { 00:22:05.810 "name": "BaseBdev1", 00:22:05.810 "uuid": "31868e62-2bf4-585c-bb86-52894803ce23", 00:22:05.810 "is_configured": true, 00:22:05.810 "data_offset": 2048, 00:22:05.810 "data_size": 63488 00:22:05.810 }, 00:22:05.810 { 00:22:05.810 "name": "BaseBdev2", 00:22:05.810 "uuid": "a851677f-7b05-571d-aedc-a3a0e3147347", 00:22:05.810 "is_configured": true, 00:22:05.810 "data_offset": 2048, 00:22:05.810 "data_size": 63488 00:22:05.810 }, 00:22:05.810 { 00:22:05.810 "name": "BaseBdev3", 00:22:05.810 "uuid": "ab760c5e-2650-55d9-b335-1c7f16f96da8", 00:22:05.810 "is_configured": true, 00:22:05.810 "data_offset": 2048, 00:22:05.810 "data_size": 63488 00:22:05.810 }, 00:22:05.810 { 00:22:05.810 "name": "BaseBdev4", 00:22:05.810 "uuid": "56a3a17a-990e-58fc-ae06-7b2e0918d3c9", 00:22:05.810 "is_configured": true, 00:22:05.810 "data_offset": 2048, 00:22:05.810 "data_size": 63488 00:22:05.810 } 00:22:05.810 ] 00:22:05.810 }' 00:22:05.810 13:40:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.810 13:40:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.374 13:40:45 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:06.374 13:40:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:06.631 [2024-07-15 13:40:45.858352] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb1c60 00:22:07.600 13:40:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:07.600 [2024-07-15 13:40:46.979461] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:07.600 [2024-07-15 13:40:46.979519] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:07.600 [2024-07-15 13:40:46.979737] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xbb1c60 00:22:07.600 13:40:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:07.600 13:40:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:07.600 13:40:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.600 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.857 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.857 "name": "raid_bdev1", 00:22:07.857 "uuid": "e4b82328-2c27-4a34-b167-2b7633f7fb1f", 00:22:07.857 "strip_size_kb": 0, 00:22:07.857 "state": "online", 00:22:07.857 "raid_level": "raid1", 00:22:07.857 "superblock": true, 00:22:07.857 "num_base_bdevs": 4, 00:22:07.857 "num_base_bdevs_discovered": 3, 00:22:07.857 "num_base_bdevs_operational": 3, 00:22:07.857 "base_bdevs_list": [ 00:22:07.857 { 00:22:07.857 "name": null, 00:22:07.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.857 "is_configured": false, 00:22:07.857 "data_offset": 2048, 00:22:07.857 "data_size": 63488 00:22:07.857 }, 00:22:07.857 { 00:22:07.857 "name": "BaseBdev2", 00:22:07.857 "uuid": "a851677f-7b05-571d-aedc-a3a0e3147347", 00:22:07.857 "is_configured": true, 00:22:07.857 "data_offset": 2048, 00:22:07.857 "data_size": 63488 00:22:07.857 }, 00:22:07.857 { 00:22:07.857 "name": "BaseBdev3", 00:22:07.857 "uuid": "ab760c5e-2650-55d9-b335-1c7f16f96da8", 00:22:07.857 "is_configured": true, 00:22:07.857 "data_offset": 2048, 
00:22:07.857 "data_size": 63488 00:22:07.857 }, 00:22:07.857 { 00:22:07.857 "name": "BaseBdev4", 00:22:07.857 "uuid": "56a3a17a-990e-58fc-ae06-7b2e0918d3c9", 00:22:07.857 "is_configured": true, 00:22:07.857 "data_offset": 2048, 00:22:07.857 "data_size": 63488 00:22:07.857 } 00:22:07.857 ] 00:22:07.857 }' 00:22:07.857 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.857 13:40:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.421 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:08.678 [2024-07-15 13:40:47.940983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:08.678 [2024-07-15 13:40:47.941016] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:08.678 [2024-07-15 13:40:47.944152] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:08.678 [2024-07-15 13:40:47.944187] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:08.678 [2024-07-15 13:40:47.944285] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:08.678 [2024-07-15 13:40:47.944297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5dc20 name raid_bdev1, state offline 00:22:08.678 0 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2173148 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2173148 ']' 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2173148 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:08.678 13:40:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2173148 00:22:08.678 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:08.678 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:08.678 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2173148' 00:22:08.678 killing process with pid 2173148 00:22:08.678 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2173148 00:22:08.678 [2024-07-15 13:40:48.009744] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:08.678 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2173148 00:22:08.678 [2024-07-15 13:40:48.042701] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GRIDqfTn6K 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:08.935 00:22:08.935 real 0m6.703s 00:22:08.935 user 0m10.433s 00:22:08.935 sys 0m1.261s 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:22:08.935 13:40:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.935 ************************************ 00:22:08.935 END TEST raid_write_error_test 00:22:08.935 ************************************ 00:22:08.935 13:40:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:08.935 13:40:48 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:08.935 13:40:48 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:08.935 13:40:48 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:08.935 13:40:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:08.935 13:40:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:08.935 13:40:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:09.193 ************************************ 00:22:09.193 START TEST raid_rebuild_test 00:22:09.193 ************************************ 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2174131 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2174131 /var/tmp/spdk-raid.sock 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:09.193 13:40:48 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2174131 ']' 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:09.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:09.193 13:40:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:09.193 [2024-07-15 13:40:48.443286] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:22:09.193 [2024-07-15 13:40:48.443358] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2174131 ] 00:22:09.193 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:09.193 Zero copy mechanism will not be used. 
00:22:09.193 [2024-07-15 13:40:48.573555] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:09.450 [2024-07-15 13:40:48.676651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:09.450 [2024-07-15 13:40:48.736439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:09.450 [2024-07-15 13:40:48.736483] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:10.012 13:40:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:10.012 13:40:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:10.012 13:40:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:10.012 13:40:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:10.269 BaseBdev1_malloc 00:22:10.269 13:40:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:10.525 [2024-07-15 13:40:49.853271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:10.525 [2024-07-15 13:40:49.853318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.525 [2024-07-15 13:40:49.853345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2158d40 00:22:10.525 [2024-07-15 13:40:49.853357] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.525 [2024-07-15 13:40:49.855116] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.525 [2024-07-15 13:40:49.855145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:10.525 BaseBdev1 00:22:10.525 13:40:49 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:10.525 13:40:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:10.782 BaseBdev2_malloc 00:22:10.782 13:40:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:11.038 [2024-07-15 13:40:50.360258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:11.038 [2024-07-15 13:40:50.360303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.038 [2024-07-15 13:40:50.360327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2159860 00:22:11.038 [2024-07-15 13:40:50.360339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.038 [2024-07-15 13:40:50.361861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.038 [2024-07-15 13:40:50.361889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:11.038 BaseBdev2 00:22:11.038 13:40:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:11.295 spare_malloc 00:22:11.295 13:40:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:11.551 spare_delay 00:22:11.551 13:40:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:22:11.808 [2024-07-15 13:40:51.094813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:11.808 [2024-07-15 13:40:51.094865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.808 [2024-07-15 13:40:51.094886] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2307ec0 00:22:11.808 [2024-07-15 13:40:51.094899] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.808 [2024-07-15 13:40:51.096502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.808 [2024-07-15 13:40:51.096529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:11.808 spare 00:22:11.808 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:12.065 [2024-07-15 13:40:51.339483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:12.065 [2024-07-15 13:40:51.340822] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:12.065 [2024-07-15 13:40:51.340902] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2309070 00:22:12.065 [2024-07-15 13:40:51.340913] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:12.065 [2024-07-15 13:40:51.341134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2302490 00:22:12.065 [2024-07-15 13:40:51.341278] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2309070 00:22:12.065 [2024-07-15 13:40:51.341288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2309070 00:22:12.065 [2024-07-15 13:40:51.341406] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.066 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.323 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.323 "name": "raid_bdev1", 00:22:12.323 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:12.323 "strip_size_kb": 0, 00:22:12.323 "state": "online", 00:22:12.323 "raid_level": "raid1", 00:22:12.323 "superblock": false, 00:22:12.323 "num_base_bdevs": 2, 00:22:12.323 "num_base_bdevs_discovered": 2, 00:22:12.323 "num_base_bdevs_operational": 2, 00:22:12.323 "base_bdevs_list": [ 00:22:12.323 { 00:22:12.323 "name": "BaseBdev1", 00:22:12.323 "uuid": 
"476b6d4f-8e97-53d0-9187-e4f86ba1a33a", 00:22:12.323 "is_configured": true, 00:22:12.323 "data_offset": 0, 00:22:12.323 "data_size": 65536 00:22:12.323 }, 00:22:12.323 { 00:22:12.323 "name": "BaseBdev2", 00:22:12.323 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:12.323 "is_configured": true, 00:22:12.323 "data_offset": 0, 00:22:12.323 "data_size": 65536 00:22:12.323 } 00:22:12.323 ] 00:22:12.323 }' 00:22:12.323 13:40:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.323 13:40:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.886 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:12.886 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:13.144 [2024-07-15 13:40:52.418600] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:13.144 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:13.144 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.144 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:13.400 13:40:52 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:13.400 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:13.401 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:13.657 [2024-07-15 13:40:52.911702] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2302490 00:22:13.657 /dev/nbd0 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:13.657 1+0 records in 00:22:13.657 1+0 records out 00:22:13.657 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245486 s, 16.7 MB/s 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:13.657 13:40:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:18.917 65536+0 records in 00:22:18.917 65536+0 records out 00:22:18.917 33554432 bytes (34 MB, 32 MiB) copied, 4.76163 s, 7.0 MB/s 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:18.917 13:40:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:18.917 [2024-07-15 13:40:58.005318] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:18.917 [2024-07-15 13:40:58.246003] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:18.917 13:40:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.917 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.176 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.176 "name": "raid_bdev1", 00:22:19.176 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:19.176 "strip_size_kb": 0, 00:22:19.176 "state": "online", 00:22:19.176 "raid_level": "raid1", 00:22:19.176 "superblock": false, 00:22:19.176 "num_base_bdevs": 2, 00:22:19.176 "num_base_bdevs_discovered": 1, 00:22:19.176 "num_base_bdevs_operational": 1, 00:22:19.176 "base_bdevs_list": [ 00:22:19.176 { 00:22:19.176 "name": null, 00:22:19.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.176 "is_configured": false, 00:22:19.176 "data_offset": 0, 00:22:19.176 "data_size": 65536 00:22:19.176 }, 00:22:19.176 { 00:22:19.176 "name": "BaseBdev2", 
00:22:19.176 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:19.176 "is_configured": true, 00:22:19.176 "data_offset": 0, 00:22:19.176 "data_size": 65536 00:22:19.176 } 00:22:19.176 ] 00:22:19.176 }' 00:22:19.176 13:40:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.176 13:40:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.743 13:40:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:20.002 [2024-07-15 13:40:59.336900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:20.002 [2024-07-15 13:40:59.341894] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2309880 00:22:20.002 [2024-07-15 13:40:59.344056] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:20.002 13:40:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:21.376 "name": "raid_bdev1", 00:22:21.376 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:21.376 "strip_size_kb": 0, 00:22:21.376 "state": "online", 00:22:21.376 "raid_level": "raid1", 00:22:21.376 "superblock": false, 00:22:21.376 "num_base_bdevs": 2, 00:22:21.376 "num_base_bdevs_discovered": 2, 00:22:21.376 "num_base_bdevs_operational": 2, 00:22:21.376 "process": { 00:22:21.376 "type": "rebuild", 00:22:21.376 "target": "spare", 00:22:21.376 "progress": { 00:22:21.376 "blocks": 24576, 00:22:21.376 "percent": 37 00:22:21.376 } 00:22:21.376 }, 00:22:21.376 "base_bdevs_list": [ 00:22:21.376 { 00:22:21.376 "name": "spare", 00:22:21.376 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:21.376 "is_configured": true, 00:22:21.376 "data_offset": 0, 00:22:21.376 "data_size": 65536 00:22:21.376 }, 00:22:21.376 { 00:22:21.376 "name": "BaseBdev2", 00:22:21.376 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:21.376 "is_configured": true, 00:22:21.376 "data_offset": 0, 00:22:21.376 "data_size": 65536 00:22:21.376 } 00:22:21.376 ] 00:22:21.376 }' 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:21.376 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:21.634 [2024-07-15 13:41:00.854222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:21.634 [2024-07-15 13:41:00.856009] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:22:21.634 [2024-07-15 13:41:00.856056] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.634 [2024-07-15 13:41:00.856071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:21.634 [2024-07-15 13:41:00.856080] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.634 13:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.891 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.891 "name": "raid_bdev1", 00:22:21.891 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:21.891 
"strip_size_kb": 0, 00:22:21.891 "state": "online", 00:22:21.891 "raid_level": "raid1", 00:22:21.891 "superblock": false, 00:22:21.891 "num_base_bdevs": 2, 00:22:21.891 "num_base_bdevs_discovered": 1, 00:22:21.891 "num_base_bdevs_operational": 1, 00:22:21.891 "base_bdevs_list": [ 00:22:21.891 { 00:22:21.891 "name": null, 00:22:21.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.891 "is_configured": false, 00:22:21.891 "data_offset": 0, 00:22:21.891 "data_size": 65536 00:22:21.891 }, 00:22:21.891 { 00:22:21.891 "name": "BaseBdev2", 00:22:21.891 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:21.891 "is_configured": true, 00:22:21.891 "data_offset": 0, 00:22:21.891 "data_size": 65536 00:22:21.891 } 00:22:21.891 ] 00:22:21.891 }' 00:22:21.891 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.891 13:41:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:22.474 "name": "raid_bdev1", 00:22:22.474 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 
00:22:22.474 "strip_size_kb": 0, 00:22:22.474 "state": "online", 00:22:22.474 "raid_level": "raid1", 00:22:22.474 "superblock": false, 00:22:22.474 "num_base_bdevs": 2, 00:22:22.474 "num_base_bdevs_discovered": 1, 00:22:22.474 "num_base_bdevs_operational": 1, 00:22:22.474 "base_bdevs_list": [ 00:22:22.474 { 00:22:22.474 "name": null, 00:22:22.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.474 "is_configured": false, 00:22:22.474 "data_offset": 0, 00:22:22.474 "data_size": 65536 00:22:22.474 }, 00:22:22.474 { 00:22:22.474 "name": "BaseBdev2", 00:22:22.474 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:22.474 "is_configured": true, 00:22:22.474 "data_offset": 0, 00:22:22.474 "data_size": 65536 00:22:22.474 } 00:22:22.474 ] 00:22:22.474 }' 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:22.474 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:22.732 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:22.732 13:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:22.732 [2024-07-15 13:41:02.143946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:22.732 [2024-07-15 13:41:02.149573] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2302490 00:22:22.732 [2024-07-15 13:41:02.151107] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:22.988 13:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.917 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:24.175 "name": "raid_bdev1", 00:22:24.175 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:24.175 "strip_size_kb": 0, 00:22:24.175 "state": "online", 00:22:24.175 "raid_level": "raid1", 00:22:24.175 "superblock": false, 00:22:24.175 "num_base_bdevs": 2, 00:22:24.175 "num_base_bdevs_discovered": 2, 00:22:24.175 "num_base_bdevs_operational": 2, 00:22:24.175 "process": { 00:22:24.175 "type": "rebuild", 00:22:24.175 "target": "spare", 00:22:24.175 "progress": { 00:22:24.175 "blocks": 24576, 00:22:24.175 "percent": 37 00:22:24.175 } 00:22:24.175 }, 00:22:24.175 "base_bdevs_list": [ 00:22:24.175 { 00:22:24.175 "name": "spare", 00:22:24.175 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:24.175 "is_configured": true, 00:22:24.175 "data_offset": 0, 00:22:24.175 "data_size": 65536 00:22:24.175 }, 00:22:24.175 { 00:22:24.175 "name": "BaseBdev2", 00:22:24.175 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:24.175 "is_configured": true, 00:22:24.175 "data_offset": 0, 00:22:24.175 "data_size": 65536 00:22:24.175 } 00:22:24.175 ] 00:22:24.175 }' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=767 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.175 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:24.432 "name": "raid_bdev1", 00:22:24.432 
"uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:24.432 "strip_size_kb": 0, 00:22:24.432 "state": "online", 00:22:24.432 "raid_level": "raid1", 00:22:24.432 "superblock": false, 00:22:24.432 "num_base_bdevs": 2, 00:22:24.432 "num_base_bdevs_discovered": 2, 00:22:24.432 "num_base_bdevs_operational": 2, 00:22:24.432 "process": { 00:22:24.432 "type": "rebuild", 00:22:24.432 "target": "spare", 00:22:24.432 "progress": { 00:22:24.432 "blocks": 30720, 00:22:24.432 "percent": 46 00:22:24.432 } 00:22:24.432 }, 00:22:24.432 "base_bdevs_list": [ 00:22:24.432 { 00:22:24.432 "name": "spare", 00:22:24.432 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:24.432 "is_configured": true, 00:22:24.432 "data_offset": 0, 00:22:24.432 "data_size": 65536 00:22:24.432 }, 00:22:24.432 { 00:22:24.432 "name": "BaseBdev2", 00:22:24.432 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:24.432 "is_configured": true, 00:22:24.432 "data_offset": 0, 00:22:24.432 "data_size": 65536 00:22:24.432 } 00:22:24.432 ] 00:22:24.432 }' 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:24.432 13:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.807 13:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.807 "name": "raid_bdev1", 00:22:25.807 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:25.807 "strip_size_kb": 0, 00:22:25.807 "state": "online", 00:22:25.807 "raid_level": "raid1", 00:22:25.807 "superblock": false, 00:22:25.807 "num_base_bdevs": 2, 00:22:25.807 "num_base_bdevs_discovered": 2, 00:22:25.807 "num_base_bdevs_operational": 2, 00:22:25.807 "process": { 00:22:25.807 "type": "rebuild", 00:22:25.807 "target": "spare", 00:22:25.807 "progress": { 00:22:25.807 "blocks": 59392, 00:22:25.807 "percent": 90 00:22:25.807 } 00:22:25.807 }, 00:22:25.807 "base_bdevs_list": [ 00:22:25.807 { 00:22:25.807 "name": "spare", 00:22:25.807 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:25.807 "is_configured": true, 00:22:25.807 "data_offset": 0, 00:22:25.807 "data_size": 65536 00:22:25.807 }, 00:22:25.807 { 00:22:25.807 "name": "BaseBdev2", 00:22:25.807 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:25.807 "is_configured": true, 00:22:25.807 "data_offset": 0, 00:22:25.807 "data_size": 65536 00:22:25.807 } 00:22:25.807 ] 00:22:25.807 }' 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.807 13:41:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:26.065 [2024-07-15 13:41:05.376133] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:26.065 [2024-07-15 13:41:05.376193] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:26.065 [2024-07-15 13:41:05.376228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.000 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:27.259 "name": "raid_bdev1", 00:22:27.259 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:27.259 "strip_size_kb": 0, 00:22:27.259 "state": "online", 00:22:27.259 "raid_level": "raid1", 00:22:27.259 "superblock": false, 00:22:27.259 "num_base_bdevs": 2, 00:22:27.259 
"num_base_bdevs_discovered": 2, 00:22:27.259 "num_base_bdevs_operational": 2, 00:22:27.259 "base_bdevs_list": [ 00:22:27.259 { 00:22:27.259 "name": "spare", 00:22:27.259 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:27.259 "is_configured": true, 00:22:27.259 "data_offset": 0, 00:22:27.259 "data_size": 65536 00:22:27.259 }, 00:22:27.259 { 00:22:27.259 "name": "BaseBdev2", 00:22:27.259 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:27.259 "is_configured": true, 00:22:27.259 "data_offset": 0, 00:22:27.259 "data_size": 65536 00:22:27.259 } 00:22:27.259 ] 00:22:27.259 }' 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.259 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.533 13:41:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:27.533 "name": "raid_bdev1", 00:22:27.533 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:27.533 "strip_size_kb": 0, 00:22:27.533 "state": "online", 00:22:27.533 "raid_level": "raid1", 00:22:27.533 "superblock": false, 00:22:27.533 "num_base_bdevs": 2, 00:22:27.533 "num_base_bdevs_discovered": 2, 00:22:27.533 "num_base_bdevs_operational": 2, 00:22:27.533 "base_bdevs_list": [ 00:22:27.533 { 00:22:27.533 "name": "spare", 00:22:27.533 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:27.533 "is_configured": true, 00:22:27.533 "data_offset": 0, 00:22:27.533 "data_size": 65536 00:22:27.533 }, 00:22:27.533 { 00:22:27.533 "name": "BaseBdev2", 00:22:27.533 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:27.533 "is_configured": true, 00:22:27.533 "data_offset": 0, 00:22:27.533 "data_size": 65536 00:22:27.533 } 00:22:27.533 ] 00:22:27.533 }' 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.533 13:41:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.822 13:41:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.822 "name": "raid_bdev1", 00:22:27.822 "uuid": "ed328526-a23c-46f8-9db7-90b2387ab7ee", 00:22:27.822 "strip_size_kb": 0, 00:22:27.822 "state": "online", 00:22:27.822 "raid_level": "raid1", 00:22:27.823 "superblock": false, 00:22:27.823 "num_base_bdevs": 2, 00:22:27.823 "num_base_bdevs_discovered": 2, 00:22:27.823 "num_base_bdevs_operational": 2, 00:22:27.823 "base_bdevs_list": [ 00:22:27.823 { 00:22:27.823 "name": "spare", 00:22:27.823 "uuid": "dd143706-4006-59ce-b6d8-56cdb1cd49de", 00:22:27.823 "is_configured": true, 00:22:27.823 "data_offset": 0, 00:22:27.823 "data_size": 65536 00:22:27.823 }, 00:22:27.823 { 00:22:27.823 "name": "BaseBdev2", 00:22:27.823 "uuid": "173687a8-595e-54ef-a56b-d31574fe192f", 00:22:27.823 "is_configured": true, 00:22:27.823 "data_offset": 0, 00:22:27.823 "data_size": 65536 00:22:27.823 } 00:22:27.823 ] 00:22:27.823 }' 00:22:27.823 13:41:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.823 13:41:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:28.391 13:41:07 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:28.650 [2024-07-15 13:41:07.928111] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:28.650 [2024-07-15 13:41:07.928139] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:28.650 [2024-07-15 13:41:07.928199] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:28.650 [2024-07-15 13:41:07.928256] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:28.650 [2024-07-15 13:41:07.928268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2309070 name raid_bdev1, state offline 00:22:28.650 13:41:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:28.650 13:41:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:28.910 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:29.169 /dev/nbd0 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.170 1+0 records in 00:22:29.170 1+0 records out 00:22:29.170 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234267 s, 17.5 MB/s 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.170 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:29.429 /dev/nbd1 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:29.429 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.430 1+0 records in 00:22:29.430 1+0 records out 00:22:29.430 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000350292 s, 11.7 MB/s 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.430 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:29.688 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:29.689 13:41:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:29.947 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:30.205 13:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:30.206 13:41:09 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2174131 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2174131 ']' 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2174131 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2174131 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2174131' 00:22:30.206 killing process with pid 2174131 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2174131 00:22:30.206 Received shutdown signal, test time was about 60.000000 seconds 00:22:30.206 00:22:30.206 Latency(us) 00:22:30.206 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:30.206 =================================================================================================================== 00:22:30.206 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:30.206 [2024-07-15 13:41:09.487186] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:30.206 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2174131 00:22:30.206 [2024-07-15 13:41:09.515814] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:22:30.464 00:22:30.464 real 0m21.368s 00:22:30.464 user 0m29.176s 00:22:30.464 sys 0m4.541s 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.464 ************************************ 00:22:30.464 END TEST raid_rebuild_test 00:22:30.464 ************************************ 00:22:30.464 13:41:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:30.464 13:41:09 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:22:30.464 13:41:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:30.464 13:41:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:30.464 13:41:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:30.464 ************************************ 00:22:30.464 START TEST raid_rebuild_test_sb 00:22:30.464 ************************************ 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2177103 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2177103 /var/tmp/spdk-raid.sock 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2177103 ']' 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:30.464 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:30.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:30.465 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:30.465 13:41:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.724 [2024-07-15 13:41:09.903221] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:22:30.724 [2024-07-15 13:41:09.903290] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2177103 ] 00:22:30.724 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:30.724 Zero copy mechanism will not be used. 
00:22:30.724 [2024-07-15 13:41:10.032750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:30.724 [2024-07-15 13:41:10.142311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:30.981 [2024-07-15 13:41:10.199264] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.981 [2024-07-15 13:41:10.199296] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:31.547 13:41:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:31.547 13:41:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:31.547 13:41:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:31.547 13:41:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:31.806 BaseBdev1_malloc 00:22:31.806 13:41:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:32.064 [2024-07-15 13:41:11.323444] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:32.064 [2024-07-15 13:41:11.323493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.064 [2024-07-15 13:41:11.323515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x255fd40 00:22:32.064 [2024-07-15 13:41:11.323527] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.064 [2024-07-15 13:41:11.325120] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.064 [2024-07-15 13:41:11.325149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:32.064 BaseBdev1 
00:22:32.064 13:41:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:32.064 13:41:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:32.322 BaseBdev2_malloc 00:22:32.322 13:41:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:32.580 [2024-07-15 13:41:11.817476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:32.580 [2024-07-15 13:41:11.817520] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.580 [2024-07-15 13:41:11.817543] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2560860 00:22:32.580 [2024-07-15 13:41:11.817555] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.580 [2024-07-15 13:41:11.818911] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.580 [2024-07-15 13:41:11.818945] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:32.580 BaseBdev2 00:22:32.580 13:41:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:32.839 spare_malloc 00:22:32.839 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:33.098 spare_delay 00:22:33.098 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:33.357 [2024-07-15 13:41:12.560120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:33.357 [2024-07-15 13:41:12.560171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:33.357 [2024-07-15 13:41:12.560198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x270eec0 00:22:33.357 [2024-07-15 13:41:12.560210] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:33.357 [2024-07-15 13:41:12.561703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:33.357 [2024-07-15 13:41:12.561732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:33.357 spare 00:22:33.357 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:33.619 [2024-07-15 13:41:12.808787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.619 [2024-07-15 13:41:12.810030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:33.619 [2024-07-15 13:41:12.810194] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2710070 00:22:33.619 [2024-07-15 13:41:12.810207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:33.619 [2024-07-15 13:41:12.810402] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2709490 00:22:33.619 [2024-07-15 13:41:12.810543] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2710070 00:22:33.619 [2024-07-15 13:41:12.810553] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x2710070 00:22:33.620 [2024-07-15 13:41:12.810647] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.620 13:41:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.879 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.879 "name": "raid_bdev1", 00:22:33.879 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:33.879 "strip_size_kb": 0, 00:22:33.879 "state": "online", 00:22:33.879 "raid_level": "raid1", 00:22:33.879 "superblock": true, 00:22:33.879 "num_base_bdevs": 2, 00:22:33.879 "num_base_bdevs_discovered": 2, 00:22:33.879 
"num_base_bdevs_operational": 2, 00:22:33.879 "base_bdevs_list": [ 00:22:33.879 { 00:22:33.879 "name": "BaseBdev1", 00:22:33.879 "uuid": "2195345e-005b-545e-8784-61c2ff372ab5", 00:22:33.879 "is_configured": true, 00:22:33.879 "data_offset": 2048, 00:22:33.879 "data_size": 63488 00:22:33.879 }, 00:22:33.879 { 00:22:33.879 "name": "BaseBdev2", 00:22:33.879 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:33.879 "is_configured": true, 00:22:33.879 "data_offset": 2048, 00:22:33.879 "data_size": 63488 00:22:33.879 } 00:22:33.879 ] 00:22:33.879 }' 00:22:33.879 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.879 13:41:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.446 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:34.446 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:34.705 [2024-07-15 13:41:13.891883] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:34.705 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:34.705 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.705 13:41:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.963 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:35.221 [2024-07-15 13:41:14.393016] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2709490 00:22:35.221 /dev/nbd0 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:35.221 1+0 records in 00:22:35.221 1+0 records out 00:22:35.221 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241116 s, 17.0 MB/s 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:35.221 13:41:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:41.784 63488+0 records in 00:22:41.784 63488+0 records out 00:22:41.784 32505856 bytes (33 MB, 
31 MiB) copied, 5.63249 s, 5.8 MB/s 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:41.784 [2024-07-15 13:41:20.357344] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:22:41.784 [2024-07-15 13:41:20.590022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.784 "name": "raid_bdev1", 00:22:41.784 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:41.784 "strip_size_kb": 0, 00:22:41.784 "state": "online", 00:22:41.784 "raid_level": "raid1", 00:22:41.784 "superblock": true, 00:22:41.784 "num_base_bdevs": 2, 00:22:41.784 "num_base_bdevs_discovered": 1, 00:22:41.784 
"num_base_bdevs_operational": 1, 00:22:41.784 "base_bdevs_list": [ 00:22:41.784 { 00:22:41.784 "name": null, 00:22:41.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.784 "is_configured": false, 00:22:41.784 "data_offset": 2048, 00:22:41.784 "data_size": 63488 00:22:41.784 }, 00:22:41.784 { 00:22:41.784 "name": "BaseBdev2", 00:22:41.784 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:41.784 "is_configured": true, 00:22:41.784 "data_offset": 2048, 00:22:41.784 "data_size": 63488 00:22:41.784 } 00:22:41.784 ] 00:22:41.784 }' 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.784 13:41:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.042 13:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:42.300 [2024-07-15 13:41:21.604731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:42.300 [2024-07-15 13:41:21.609694] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x270fce0 00:22:42.300 [2024-07-15 13:41:21.611906] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:42.300 13:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.234 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.492 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.492 "name": "raid_bdev1", 00:22:43.492 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:43.492 "strip_size_kb": 0, 00:22:43.492 "state": "online", 00:22:43.492 "raid_level": "raid1", 00:22:43.492 "superblock": true, 00:22:43.492 "num_base_bdevs": 2, 00:22:43.492 "num_base_bdevs_discovered": 2, 00:22:43.492 "num_base_bdevs_operational": 2, 00:22:43.492 "process": { 00:22:43.492 "type": "rebuild", 00:22:43.492 "target": "spare", 00:22:43.492 "progress": { 00:22:43.492 "blocks": 24576, 00:22:43.492 "percent": 38 00:22:43.492 } 00:22:43.492 }, 00:22:43.492 "base_bdevs_list": [ 00:22:43.492 { 00:22:43.492 "name": "spare", 00:22:43.492 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:43.492 "is_configured": true, 00:22:43.492 "data_offset": 2048, 00:22:43.492 "data_size": 63488 00:22:43.492 }, 00:22:43.492 { 00:22:43.492 "name": "BaseBdev2", 00:22:43.492 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:43.492 "is_configured": true, 00:22:43.492 "data_offset": 2048, 00:22:43.492 "data_size": 63488 00:22:43.492 } 00:22:43.492 ] 00:22:43.492 }' 00:22:43.492 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.750 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:43.750 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.750 13:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:43.750 13:41:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:44.012 [2024-07-15 13:41:23.194697] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:44.012 [2024-07-15 13:41:23.224609] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:44.012 [2024-07-15 13:41:23.224655] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:44.012 [2024-07-15 13:41:23.224671] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:44.012 [2024-07-15 13:41:23.224680] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.012 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.270 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.270 "name": "raid_bdev1", 00:22:44.270 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:44.270 "strip_size_kb": 0, 00:22:44.270 "state": "online", 00:22:44.270 "raid_level": "raid1", 00:22:44.270 "superblock": true, 00:22:44.270 "num_base_bdevs": 2, 00:22:44.270 "num_base_bdevs_discovered": 1, 00:22:44.270 "num_base_bdevs_operational": 1, 00:22:44.270 "base_bdevs_list": [ 00:22:44.270 { 00:22:44.270 "name": null, 00:22:44.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.270 "is_configured": false, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 }, 00:22:44.270 { 00:22:44.270 "name": "BaseBdev2", 00:22:44.270 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 } 00:22:44.270 ] 00:22:44.270 }' 00:22:44.270 13:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.270 13:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.837 13:41:24 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.837 "name": "raid_bdev1", 00:22:44.837 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:44.837 "strip_size_kb": 0, 00:22:44.837 "state": "online", 00:22:44.837 "raid_level": "raid1", 00:22:44.837 "superblock": true, 00:22:44.837 "num_base_bdevs": 2, 00:22:44.837 "num_base_bdevs_discovered": 1, 00:22:44.837 "num_base_bdevs_operational": 1, 00:22:44.837 "base_bdevs_list": [ 00:22:44.837 { 00:22:44.837 "name": null, 00:22:44.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.837 "is_configured": false, 00:22:44.837 "data_offset": 2048, 00:22:44.837 "data_size": 63488 00:22:44.837 }, 00:22:44.837 { 00:22:44.837 "name": "BaseBdev2", 00:22:44.837 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:44.837 "is_configured": true, 00:22:44.837 "data_offset": 2048, 00:22:44.837 "data_size": 63488 00:22:44.837 } 00:22:44.837 ] 00:22:44.837 }' 00:22:44.837 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.096 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:45.096 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.096 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:45.096 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:45.096 [2024-07-15 13:41:24.504492] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:45.096 [2024-07-15 13:41:24.509490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x270fce0 00:22:45.096 [2024-07-15 13:41:24.510956] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:45.354 13:41:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.287 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.545 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.545 "name": "raid_bdev1", 00:22:46.545 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:46.545 "strip_size_kb": 0, 00:22:46.545 "state": "online", 00:22:46.545 "raid_level": "raid1", 00:22:46.545 "superblock": true, 00:22:46.545 "num_base_bdevs": 2, 00:22:46.545 "num_base_bdevs_discovered": 2, 00:22:46.545 "num_base_bdevs_operational": 2, 00:22:46.545 "process": { 00:22:46.545 "type": "rebuild", 00:22:46.545 "target": "spare", 00:22:46.545 "progress": { 00:22:46.546 "blocks": 24576, 00:22:46.546 "percent": 38 00:22:46.546 } 00:22:46.546 }, 00:22:46.546 
"base_bdevs_list": [ 00:22:46.546 { 00:22:46.546 "name": "spare", 00:22:46.546 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:46.546 "is_configured": true, 00:22:46.546 "data_offset": 2048, 00:22:46.546 "data_size": 63488 00:22:46.546 }, 00:22:46.546 { 00:22:46.546 "name": "BaseBdev2", 00:22:46.546 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:46.546 "is_configured": true, 00:22:46.546 "data_offset": 2048, 00:22:46.546 "data_size": 63488 00:22:46.546 } 00:22:46.546 ] 00:22:46.546 }' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:46.546 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=789 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.546 13:41:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.546 13:41:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.804 "name": "raid_bdev1", 00:22:46.804 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:46.804 "strip_size_kb": 0, 00:22:46.804 "state": "online", 00:22:46.804 "raid_level": "raid1", 00:22:46.804 "superblock": true, 00:22:46.804 "num_base_bdevs": 2, 00:22:46.804 "num_base_bdevs_discovered": 2, 00:22:46.804 "num_base_bdevs_operational": 2, 00:22:46.804 "process": { 00:22:46.804 "type": "rebuild", 00:22:46.804 "target": "spare", 00:22:46.804 "progress": { 00:22:46.804 "blocks": 30720, 00:22:46.804 "percent": 48 00:22:46.804 } 00:22:46.804 }, 00:22:46.804 "base_bdevs_list": [ 00:22:46.804 { 00:22:46.804 "name": "spare", 00:22:46.804 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:46.804 "is_configured": true, 00:22:46.804 "data_offset": 2048, 00:22:46.804 "data_size": 63488 00:22:46.804 }, 00:22:46.804 { 00:22:46.804 "name": "BaseBdev2", 00:22:46.804 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:46.804 "is_configured": true, 00:22:46.804 "data_offset": 2048, 00:22:46.804 "data_size": 63488 00:22:46.804 } 00:22:46.804 ] 00:22:46.804 }' 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.804 13:41:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.180 "name": "raid_bdev1", 00:22:48.180 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:48.180 "strip_size_kb": 0, 00:22:48.180 "state": "online", 00:22:48.180 "raid_level": "raid1", 00:22:48.180 "superblock": true, 00:22:48.180 "num_base_bdevs": 2, 00:22:48.180 "num_base_bdevs_discovered": 2, 00:22:48.180 "num_base_bdevs_operational": 2, 00:22:48.180 "process": { 00:22:48.180 "type": "rebuild", 00:22:48.180 "target": "spare", 
00:22:48.180 "progress": { 00:22:48.180 "blocks": 59392, 00:22:48.180 "percent": 93 00:22:48.180 } 00:22:48.180 }, 00:22:48.180 "base_bdevs_list": [ 00:22:48.180 { 00:22:48.180 "name": "spare", 00:22:48.180 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:48.180 "is_configured": true, 00:22:48.180 "data_offset": 2048, 00:22:48.180 "data_size": 63488 00:22:48.180 }, 00:22:48.180 { 00:22:48.180 "name": "BaseBdev2", 00:22:48.180 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:48.180 "is_configured": true, 00:22:48.180 "data_offset": 2048, 00:22:48.180 "data_size": 63488 00:22:48.180 } 00:22:48.180 ] 00:22:48.180 }' 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:48.180 13:41:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:48.440 [2024-07-15 13:41:27.635154] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:48.440 [2024-07-15 13:41:27.635214] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:48.440 [2024-07-15 13:41:27.635294] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.409 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.410 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.410 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.410 "name": "raid_bdev1", 00:22:49.410 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:49.410 "strip_size_kb": 0, 00:22:49.410 "state": "online", 00:22:49.410 "raid_level": "raid1", 00:22:49.410 "superblock": true, 00:22:49.410 "num_base_bdevs": 2, 00:22:49.410 "num_base_bdevs_discovered": 2, 00:22:49.410 "num_base_bdevs_operational": 2, 00:22:49.410 "base_bdevs_list": [ 00:22:49.410 { 00:22:49.410 "name": "spare", 00:22:49.410 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:49.410 "is_configured": true, 00:22:49.410 "data_offset": 2048, 00:22:49.410 "data_size": 63488 00:22:49.410 }, 00:22:49.410 { 00:22:49.410 "name": "BaseBdev2", 00:22:49.410 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:49.410 "is_configured": true, 00:22:49.410 "data_offset": 2048, 00:22:49.410 "data_size": 63488 00:22:49.410 } 00:22:49.410 ] 00:22:49.410 }' 00:22:49.410 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:49.668 
13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.668 13:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.926 "name": "raid_bdev1", 00:22:49.926 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:49.926 "strip_size_kb": 0, 00:22:49.926 "state": "online", 00:22:49.926 "raid_level": "raid1", 00:22:49.926 "superblock": true, 00:22:49.926 "num_base_bdevs": 2, 00:22:49.926 "num_base_bdevs_discovered": 2, 00:22:49.926 "num_base_bdevs_operational": 2, 00:22:49.926 "base_bdevs_list": [ 00:22:49.926 { 00:22:49.926 "name": "spare", 00:22:49.926 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:49.926 "is_configured": true, 00:22:49.926 "data_offset": 2048, 00:22:49.926 "data_size": 63488 00:22:49.926 }, 00:22:49.926 { 00:22:49.926 "name": "BaseBdev2", 00:22:49.926 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:49.926 "is_configured": true, 00:22:49.926 "data_offset": 2048, 00:22:49.926 "data_size": 63488 00:22:49.926 } 00:22:49.926 ] 00:22:49.926 }' 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.926 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.183 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.183 "name": "raid_bdev1", 00:22:50.183 "uuid": 
"8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:50.183 "strip_size_kb": 0, 00:22:50.183 "state": "online", 00:22:50.183 "raid_level": "raid1", 00:22:50.183 "superblock": true, 00:22:50.183 "num_base_bdevs": 2, 00:22:50.183 "num_base_bdevs_discovered": 2, 00:22:50.183 "num_base_bdevs_operational": 2, 00:22:50.183 "base_bdevs_list": [ 00:22:50.183 { 00:22:50.183 "name": "spare", 00:22:50.183 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:50.183 "is_configured": true, 00:22:50.183 "data_offset": 2048, 00:22:50.183 "data_size": 63488 00:22:50.183 }, 00:22:50.183 { 00:22:50.183 "name": "BaseBdev2", 00:22:50.183 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:50.183 "is_configured": true, 00:22:50.183 "data_offset": 2048, 00:22:50.183 "data_size": 63488 00:22:50.183 } 00:22:50.183 ] 00:22:50.183 }' 00:22:50.183 13:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.183 13:41:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:50.748 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:51.006 [2024-07-15 13:41:30.310946] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:51.006 [2024-07-15 13:41:30.310979] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:51.006 [2024-07-15 13:41:30.311045] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:51.006 [2024-07-15 13:41:30.311103] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:51.006 [2024-07-15 13:41:30.311116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2710070 name raid_bdev1, state offline 00:22:51.006 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.006 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:22:51.263 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:51.263 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.264 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:51.521 /dev/nbd0 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.521 1+0 records in 00:22:51.521 1+0 records out 00:22:51.521 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233995 s, 17.5 MB/s 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.521 13:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.521 13:41:30 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:51.778 /dev/nbd1 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.778 1+0 records in 00:22:51.778 1+0 records out 00:22:51.778 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303883 s, 13.5 MB/s 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.778 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:52.036 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:52.293 13:41:31 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:52.293 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:52.552 13:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:52.810 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:53.068 [2024-07-15 13:41:32.276330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:22:53.068 [2024-07-15 13:41:32.276381] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.068 [2024-07-15 13:41:32.276406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x270f500 00:22:53.068 [2024-07-15 13:41:32.276418] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.068 [2024-07-15 13:41:32.278078] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.068 [2024-07-15 13:41:32.278106] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:53.068 [2024-07-15 13:41:32.278188] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:53.068 [2024-07-15 13:41:32.278216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:53.068 [2024-07-15 13:41:32.278316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:53.068 spare 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.068 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.068 [2024-07-15 13:41:32.378632] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x270e260 00:22:53.068 [2024-07-15 13:41:32.378649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:53.068 [2024-07-15 13:41:32.378852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2709490 00:22:53.068 [2024-07-15 13:41:32.379007] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x270e260 00:22:53.068 [2024-07-15 13:41:32.379018] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x270e260 00:22:53.068 [2024-07-15 13:41:32.379122] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.327 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.327 "name": "raid_bdev1", 00:22:53.327 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:53.327 "strip_size_kb": 0, 00:22:53.327 "state": "online", 00:22:53.327 "raid_level": "raid1", 00:22:53.327 "superblock": true, 00:22:53.327 "num_base_bdevs": 2, 00:22:53.327 "num_base_bdevs_discovered": 2, 00:22:53.327 "num_base_bdevs_operational": 2, 00:22:53.327 "base_bdevs_list": [ 00:22:53.327 { 00:22:53.327 "name": "spare", 00:22:53.327 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:53.327 "is_configured": true, 00:22:53.327 "data_offset": 2048, 00:22:53.327 "data_size": 63488 00:22:53.327 }, 00:22:53.327 { 00:22:53.327 "name": "BaseBdev2", 00:22:53.327 "uuid": 
"d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:53.327 "is_configured": true, 00:22:53.327 "data_offset": 2048, 00:22:53.327 "data_size": 63488 00:22:53.327 } 00:22:53.327 ] 00:22:53.327 }' 00:22:53.327 13:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.327 13:41:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.905 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.166 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.166 "name": "raid_bdev1", 00:22:54.166 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:54.166 "strip_size_kb": 0, 00:22:54.166 "state": "online", 00:22:54.166 "raid_level": "raid1", 00:22:54.166 "superblock": true, 00:22:54.166 "num_base_bdevs": 2, 00:22:54.166 "num_base_bdevs_discovered": 2, 00:22:54.166 "num_base_bdevs_operational": 2, 00:22:54.166 "base_bdevs_list": [ 00:22:54.166 { 00:22:54.166 "name": "spare", 00:22:54.166 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:54.166 "is_configured": true, 00:22:54.166 "data_offset": 2048, 00:22:54.166 "data_size": 63488 00:22:54.166 }, 00:22:54.166 { 
00:22:54.166 "name": "BaseBdev2", 00:22:54.167 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:54.167 "is_configured": true, 00:22:54.167 "data_offset": 2048, 00:22:54.167 "data_size": 63488 00:22:54.167 } 00:22:54.167 ] 00:22:54.167 }' 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.167 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:54.425 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.425 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:54.683 [2024-07-15 13:41:33.912806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.683 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.684 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.684 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.684 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.684 13:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.251 13:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.251 "name": "raid_bdev1", 00:22:55.251 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:55.251 "strip_size_kb": 0, 00:22:55.251 "state": "online", 00:22:55.251 "raid_level": "raid1", 00:22:55.251 "superblock": true, 00:22:55.251 "num_base_bdevs": 2, 00:22:55.251 "num_base_bdevs_discovered": 1, 00:22:55.251 "num_base_bdevs_operational": 1, 00:22:55.251 "base_bdevs_list": [ 00:22:55.251 { 00:22:55.251 "name": null, 00:22:55.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.251 "is_configured": false, 00:22:55.251 "data_offset": 2048, 00:22:55.251 "data_size": 63488 00:22:55.251 }, 00:22:55.251 { 00:22:55.251 "name": "BaseBdev2", 00:22:55.251 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:55.251 "is_configured": true, 00:22:55.251 "data_offset": 2048, 00:22:55.251 "data_size": 63488 00:22:55.251 } 00:22:55.251 ] 00:22:55.251 }' 00:22:55.251 13:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.251 13:41:34 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:22:55.817 13:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:56.075 [2024-07-15 13:41:35.264400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:56.075 [2024-07-15 13:41:35.264561] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:56.075 [2024-07-15 13:41:35.264579] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:56.075 [2024-07-15 13:41:35.264614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:56.075 [2024-07-15 13:41:35.269405] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2709490 00:22:56.075 [2024-07-15 13:41:35.271734] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:56.075 13:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.010 13:41:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.268 "name": "raid_bdev1", 00:22:57.268 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:57.268 "strip_size_kb": 0, 00:22:57.268 "state": "online", 00:22:57.268 "raid_level": "raid1", 00:22:57.268 "superblock": true, 00:22:57.268 "num_base_bdevs": 2, 00:22:57.268 "num_base_bdevs_discovered": 2, 00:22:57.268 "num_base_bdevs_operational": 2, 00:22:57.268 "process": { 00:22:57.268 "type": "rebuild", 00:22:57.268 "target": "spare", 00:22:57.268 "progress": { 00:22:57.268 "blocks": 24576, 00:22:57.268 "percent": 38 00:22:57.268 } 00:22:57.268 }, 00:22:57.268 "base_bdevs_list": [ 00:22:57.268 { 00:22:57.268 "name": "spare", 00:22:57.268 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:57.268 "is_configured": true, 00:22:57.268 "data_offset": 2048, 00:22:57.268 "data_size": 63488 00:22:57.268 }, 00:22:57.268 { 00:22:57.268 "name": "BaseBdev2", 00:22:57.268 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:57.268 "is_configured": true, 00:22:57.268 "data_offset": 2048, 00:22:57.268 "data_size": 63488 00:22:57.268 } 00:22:57.268 ] 00:22:57.268 }' 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.268 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:57.527 [2024-07-15 13:41:36.861955] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:22:57.527 [2024-07-15 13:41:36.884272] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:57.527 [2024-07-15 13:41:36.884313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:57.527 [2024-07-15 13:41:36.884328] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:57.527 [2024-07-15 13:41:36.884336] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.527 13:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.785 13:41:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.785 "name": "raid_bdev1", 00:22:57.785 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:57.785 "strip_size_kb": 0, 00:22:57.785 "state": "online", 00:22:57.785 "raid_level": "raid1", 00:22:57.785 "superblock": true, 00:22:57.785 "num_base_bdevs": 2, 00:22:57.785 "num_base_bdevs_discovered": 1, 00:22:57.785 "num_base_bdevs_operational": 1, 00:22:57.785 "base_bdevs_list": [ 00:22:57.785 { 00:22:57.785 "name": null, 00:22:57.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.785 "is_configured": false, 00:22:57.785 "data_offset": 2048, 00:22:57.785 "data_size": 63488 00:22:57.785 }, 00:22:57.785 { 00:22:57.785 "name": "BaseBdev2", 00:22:57.785 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:57.785 "is_configured": true, 00:22:57.785 "data_offset": 2048, 00:22:57.785 "data_size": 63488 00:22:57.785 } 00:22:57.785 ] 00:22:57.785 }' 00:22:57.785 13:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.785 13:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:58.351 13:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:58.609 [2024-07-15 13:41:37.951558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:58.609 [2024-07-15 13:41:37.951618] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:58.609 [2024-07-15 13:41:37.951644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x270f730 00:22:58.609 [2024-07-15 13:41:37.951657] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:58.609 [2024-07-15 13:41:37.952070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:58.609 [2024-07-15 
13:41:37.952090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:58.609 [2024-07-15 13:41:37.952178] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:58.609 [2024-07-15 13:41:37.952191] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:58.609 [2024-07-15 13:41:37.952203] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:58.609 [2024-07-15 13:41:37.952223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:58.609 [2024-07-15 13:41:37.957774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2710aa0 00:22:58.609 spare 00:22:58.609 [2024-07-15 13:41:37.959300] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:58.609 13:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:59.983 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.983 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.983 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.983 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.984 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.984 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.984 13:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.984 "name": "raid_bdev1", 00:22:59.984 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:22:59.984 "strip_size_kb": 0, 00:22:59.984 "state": "online", 00:22:59.984 "raid_level": "raid1", 00:22:59.984 "superblock": true, 00:22:59.984 "num_base_bdevs": 2, 00:22:59.984 "num_base_bdevs_discovered": 2, 00:22:59.984 "num_base_bdevs_operational": 2, 00:22:59.984 "process": { 00:22:59.984 "type": "rebuild", 00:22:59.984 "target": "spare", 00:22:59.984 "progress": { 00:22:59.984 "blocks": 24576, 00:22:59.984 "percent": 38 00:22:59.984 } 00:22:59.984 }, 00:22:59.984 "base_bdevs_list": [ 00:22:59.984 { 00:22:59.984 "name": "spare", 00:22:59.984 "uuid": "261554e6-a812-5a67-ba19-43c29069f06a", 00:22:59.984 "is_configured": true, 00:22:59.984 "data_offset": 2048, 00:22:59.984 "data_size": 63488 00:22:59.984 }, 00:22:59.984 { 00:22:59.984 "name": "BaseBdev2", 00:22:59.984 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:22:59.984 "is_configured": true, 00:22:59.984 "data_offset": 2048, 00:22:59.984 "data_size": 63488 00:22:59.984 } 00:22:59.984 ] 00:22:59.984 }' 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:59.984 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:00.243 [2024-07-15 13:41:39.538474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:00.243 [2024-07-15 13:41:39.571882] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:23:00.243 [2024-07-15 13:41:39.571930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.243 [2024-07-15 13:41:39.571947] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:00.243 [2024-07-15 13:41:39.571956] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.243 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.500 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.500 "name": "raid_bdev1", 00:23:00.500 "uuid": 
"8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:00.500 "strip_size_kb": 0, 00:23:00.500 "state": "online", 00:23:00.500 "raid_level": "raid1", 00:23:00.500 "superblock": true, 00:23:00.500 "num_base_bdevs": 2, 00:23:00.500 "num_base_bdevs_discovered": 1, 00:23:00.500 "num_base_bdevs_operational": 1, 00:23:00.500 "base_bdevs_list": [ 00:23:00.500 { 00:23:00.500 "name": null, 00:23:00.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.500 "is_configured": false, 00:23:00.500 "data_offset": 2048, 00:23:00.500 "data_size": 63488 00:23:00.500 }, 00:23:00.500 { 00:23:00.500 "name": "BaseBdev2", 00:23:00.500 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:00.500 "is_configured": true, 00:23:00.500 "data_offset": 2048, 00:23:00.500 "data_size": 63488 00:23:00.500 } 00:23:00.500 ] 00:23:00.500 }' 00:23:00.500 13:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.500 13:41:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.062 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.319 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:23:01.319 "name": "raid_bdev1", 00:23:01.319 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:01.319 "strip_size_kb": 0, 00:23:01.319 "state": "online", 00:23:01.319 "raid_level": "raid1", 00:23:01.319 "superblock": true, 00:23:01.319 "num_base_bdevs": 2, 00:23:01.319 "num_base_bdevs_discovered": 1, 00:23:01.319 "num_base_bdevs_operational": 1, 00:23:01.319 "base_bdevs_list": [ 00:23:01.319 { 00:23:01.319 "name": null, 00:23:01.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.319 "is_configured": false, 00:23:01.319 "data_offset": 2048, 00:23:01.319 "data_size": 63488 00:23:01.319 }, 00:23:01.319 { 00:23:01.319 "name": "BaseBdev2", 00:23:01.319 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:01.319 "is_configured": true, 00:23:01.319 "data_offset": 2048, 00:23:01.319 "data_size": 63488 00:23:01.319 } 00:23:01.319 ] 00:23:01.319 }' 00:23:01.319 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.319 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:01.319 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.576 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.576 13:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:01.832 13:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:01.832 [2024-07-15 13:41:41.248943] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:01.832 [2024-07-15 13:41:41.248996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:01.832 
[2024-07-15 13:41:41.249018] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x270a650 00:23:01.832 [2024-07-15 13:41:41.249031] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:01.832 [2024-07-15 13:41:41.249396] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:01.832 [2024-07-15 13:41:41.249414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:01.832 [2024-07-15 13:41:41.249480] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:01.832 [2024-07-15 13:41:41.249493] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:01.832 [2024-07-15 13:41:41.249504] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:01.832 BaseBdev1 00:23:02.096 13:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.026 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.027 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.284 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:03.284 "name": "raid_bdev1", 00:23:03.284 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:03.284 "strip_size_kb": 0, 00:23:03.284 "state": "online", 00:23:03.284 "raid_level": "raid1", 00:23:03.284 "superblock": true, 00:23:03.284 "num_base_bdevs": 2, 00:23:03.284 "num_base_bdevs_discovered": 1, 00:23:03.284 "num_base_bdevs_operational": 1, 00:23:03.284 "base_bdevs_list": [ 00:23:03.284 { 00:23:03.284 "name": null, 00:23:03.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.284 "is_configured": false, 00:23:03.284 "data_offset": 2048, 00:23:03.284 "data_size": 63488 00:23:03.284 }, 00:23:03.284 { 00:23:03.285 "name": "BaseBdev2", 00:23:03.285 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:03.285 "is_configured": true, 00:23:03.285 "data_offset": 2048, 00:23:03.285 "data_size": 63488 00:23:03.285 } 00:23:03.285 ] 00:23:03.285 }' 00:23:03.285 13:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.285 13:41:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.849 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.108 "name": "raid_bdev1", 00:23:04.108 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:04.108 "strip_size_kb": 0, 00:23:04.108 "state": "online", 00:23:04.108 "raid_level": "raid1", 00:23:04.108 "superblock": true, 00:23:04.108 "num_base_bdevs": 2, 00:23:04.108 "num_base_bdevs_discovered": 1, 00:23:04.108 "num_base_bdevs_operational": 1, 00:23:04.108 "base_bdevs_list": [ 00:23:04.108 { 00:23:04.108 "name": null, 00:23:04.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.108 "is_configured": false, 00:23:04.108 "data_offset": 2048, 00:23:04.108 "data_size": 63488 00:23:04.108 }, 00:23:04.108 { 00:23:04.108 "name": "BaseBdev2", 00:23:04.108 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:04.108 "is_configured": true, 00:23:04.108 "data_offset": 2048, 00:23:04.108 "data_size": 63488 00:23:04.108 } 00:23:04.108 ] 00:23:04.108 }' 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:04.108 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:04.366 [2024-07-15 13:41:43.683394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:23:04.366 [2024-07-15 13:41:43.683528] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:04.366 [2024-07-15 13:41:43.683544] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:04.366 request: 00:23:04.366 { 00:23:04.366 "base_bdev": "BaseBdev1", 00:23:04.366 "raid_bdev": "raid_bdev1", 00:23:04.366 "method": "bdev_raid_add_base_bdev", 00:23:04.366 "req_id": 1 00:23:04.366 } 00:23:04.366 Got JSON-RPC error response 00:23:04.366 response: 00:23:04.366 { 00:23:04.366 "code": -22, 00:23:04.366 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:04.366 } 00:23:04.366 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:04.366 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:04.366 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:04.366 13:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:04.366 13:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.299 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.556 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.556 "name": "raid_bdev1", 00:23:05.556 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:05.556 "strip_size_kb": 0, 00:23:05.556 "state": "online", 00:23:05.556 "raid_level": "raid1", 00:23:05.556 "superblock": true, 00:23:05.556 "num_base_bdevs": 2, 00:23:05.556 "num_base_bdevs_discovered": 1, 00:23:05.556 "num_base_bdevs_operational": 1, 00:23:05.556 "base_bdevs_list": [ 00:23:05.556 { 00:23:05.556 "name": null, 00:23:05.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.556 "is_configured": false, 00:23:05.556 "data_offset": 2048, 00:23:05.556 "data_size": 63488 00:23:05.556 }, 00:23:05.556 { 00:23:05.556 "name": "BaseBdev2", 00:23:05.556 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:05.556 "is_configured": true, 00:23:05.556 "data_offset": 2048, 00:23:05.556 "data_size": 63488 00:23:05.556 } 00:23:05.556 ] 00:23:05.556 }' 00:23:05.556 13:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.556 13:41:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:06.534 
13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.534 "name": "raid_bdev1", 00:23:06.534 "uuid": "8aa34952-9cb2-4e70-b648-cf3eacc5ad67", 00:23:06.534 "strip_size_kb": 0, 00:23:06.534 "state": "online", 00:23:06.534 "raid_level": "raid1", 00:23:06.534 "superblock": true, 00:23:06.534 "num_base_bdevs": 2, 00:23:06.534 "num_base_bdevs_discovered": 1, 00:23:06.534 "num_base_bdevs_operational": 1, 00:23:06.534 "base_bdevs_list": [ 00:23:06.534 { 00:23:06.534 "name": null, 00:23:06.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.534 "is_configured": false, 00:23:06.534 "data_offset": 2048, 00:23:06.534 "data_size": 63488 00:23:06.534 }, 00:23:06.534 { 00:23:06.534 "name": "BaseBdev2", 00:23:06.534 "uuid": "d9d6bb3a-8c52-5cb2-96a0-10362f9cfc8c", 00:23:06.534 "is_configured": true, 00:23:06.534 "data_offset": 2048, 00:23:06.534 "data_size": 63488 00:23:06.534 } 00:23:06.534 ] 00:23:06.534 }' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2177103 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2177103 ']' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2177103 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2177103 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2177103' 00:23:06.534 killing process with pid 2177103 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2177103 00:23:06.534 Received shutdown signal, test time was about 60.000000 seconds 00:23:06.534 00:23:06.534 Latency(us) 00:23:06.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:06.534 =================================================================================================================== 00:23:06.534 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:06.534 [2024-07-15 13:41:45.891956] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:06.534 [2024-07-15 13:41:45.892062] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:06.534 [2024-07-15 13:41:45.892109] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:06.534 13:41:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2177103 00:23:06.534 [2024-07-15 13:41:45.892121] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x270e260 name raid_bdev1, state offline 00:23:06.534 [2024-07-15 13:41:45.923124] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.792 13:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:06.792 00:23:06.792 real 0m36.319s 00:23:06.792 user 0m52.249s 00:23:06.792 sys 0m7.074s 00:23:06.792 13:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:06.792 13:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.792 ************************************ 00:23:06.792 END TEST raid_rebuild_test_sb 00:23:06.792 ************************************ 00:23:06.792 13:41:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:06.792 13:41:46 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:06.792 13:41:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:06.792 13:41:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:06.792 13:41:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:07.050 ************************************ 00:23:07.050 START TEST raid_rebuild_test_io 00:23:07.050 ************************************ 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:07.050 13:41:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2182222 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2182222 /var/tmp/spdk-raid.sock 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2182222 ']' 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:07.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:07.050 13:41:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:07.050 [2024-07-15 13:41:46.301822] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:23:07.050 [2024-07-15 13:41:46.301887] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2182222 ] 00:23:07.050 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:23:07.050 Zero copy mechanism will not be used. 00:23:07.050 [2024-07-15 13:41:46.431455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:07.308 [2024-07-15 13:41:46.539628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:07.308 [2024-07-15 13:41:46.606642] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.308 [2024-07-15 13:41:46.606684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.873 13:41:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:07.873 13:41:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:07.873 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:07.873 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:08.130 BaseBdev1_malloc 00:23:08.130 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:08.389 [2024-07-15 13:41:47.650012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:08.389 [2024-07-15 13:41:47.650061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.389 [2024-07-15 13:41:47.650086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x90bd40 00:23:08.389 [2024-07-15 13:41:47.650099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.389 [2024-07-15 13:41:47.651757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.389 [2024-07-15 13:41:47.651785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:23:08.389 BaseBdev1 00:23:08.389 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:08.389 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:08.647 BaseBdev2_malloc 00:23:08.647 13:41:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:08.904 [2024-07-15 13:41:48.096054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:08.904 [2024-07-15 13:41:48.096104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.904 [2024-07-15 13:41:48.096133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x90c860 00:23:08.904 [2024-07-15 13:41:48.096145] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.904 [2024-07-15 13:41:48.097740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.904 [2024-07-15 13:41:48.097769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:08.904 BaseBdev2 00:23:08.904 13:41:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:09.162 spare_malloc 00:23:09.162 13:41:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:09.419 spare_delay 00:23:09.420 13:41:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:09.420 [2024-07-15 13:41:48.831870] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:09.420 [2024-07-15 13:41:48.831918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.420 [2024-07-15 13:41:48.831946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xabaec0 00:23:09.420 [2024-07-15 13:41:48.831959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.420 [2024-07-15 13:41:48.833562] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.420 [2024-07-15 13:41:48.833591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:09.420 spare 00:23:09.677 13:41:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:09.677 [2024-07-15 13:41:49.064511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:09.677 [2024-07-15 13:41:49.065818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:09.677 [2024-07-15 13:41:49.065901] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xabc070 00:23:09.677 [2024-07-15 13:41:49.065912] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:09.677 [2024-07-15 13:41:49.066128] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab5490 00:23:09.677 [2024-07-15 13:41:49.066271] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xabc070 00:23:09.677 [2024-07-15 13:41:49.066281] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xabc070 00:23:09.677 [2024-07-15 13:41:49.066403] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.677 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.935 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.935 "name": "raid_bdev1", 00:23:09.935 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:09.935 "strip_size_kb": 0, 00:23:09.935 "state": "online", 00:23:09.935 "raid_level": "raid1", 00:23:09.935 "superblock": false, 00:23:09.935 "num_base_bdevs": 2, 00:23:09.935 "num_base_bdevs_discovered": 2, 00:23:09.935 "num_base_bdevs_operational": 
2, 00:23:09.935 "base_bdevs_list": [ 00:23:09.935 { 00:23:09.935 "name": "BaseBdev1", 00:23:09.935 "uuid": "94dfff20-20bd-5381-922e-6a69b0530a82", 00:23:09.935 "is_configured": true, 00:23:09.935 "data_offset": 0, 00:23:09.935 "data_size": 65536 00:23:09.935 }, 00:23:09.935 { 00:23:09.935 "name": "BaseBdev2", 00:23:09.935 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:09.935 "is_configured": true, 00:23:09.935 "data_offset": 0, 00:23:09.935 "data_size": 65536 00:23:09.935 } 00:23:09.935 ] 00:23:09.935 }' 00:23:09.935 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.935 13:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:10.867 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:10.867 13:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:10.867 [2024-07-15 13:41:50.199773] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:10.867 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:10.867 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.867 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:11.124 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:11.124 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:11.124 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:11.124 13:41:50 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:11.382 [2024-07-15 13:41:50.574666] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab6bd0 00:23:11.382 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:11.382 Zero copy mechanism will not be used. 00:23:11.382 Running I/O for 60 seconds... 00:23:11.382 [2024-07-15 13:41:50.624975] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:11.382 [2024-07-15 13:41:50.633112] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xab6bd0 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:11.382 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.640 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.640 "name": "raid_bdev1", 00:23:11.640 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:11.640 "strip_size_kb": 0, 00:23:11.640 "state": "online", 00:23:11.640 "raid_level": "raid1", 00:23:11.640 "superblock": false, 00:23:11.640 "num_base_bdevs": 2, 00:23:11.640 "num_base_bdevs_discovered": 1, 00:23:11.640 "num_base_bdevs_operational": 1, 00:23:11.640 "base_bdevs_list": [ 00:23:11.640 { 00:23:11.640 "name": null, 00:23:11.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.640 "is_configured": false, 00:23:11.640 "data_offset": 0, 00:23:11.640 "data_size": 65536 00:23:11.640 }, 00:23:11.640 { 00:23:11.640 "name": "BaseBdev2", 00:23:11.640 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:11.640 "is_configured": true, 00:23:11.640 "data_offset": 0, 00:23:11.640 "data_size": 65536 00:23:11.640 } 00:23:11.640 ] 00:23:11.640 }' 00:23:11.640 13:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.640 13:41:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:12.205 13:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:12.463 [2024-07-15 13:41:51.786537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:12.463 13:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:12.463 [2024-07-15 13:41:51.853465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa3e8b0 00:23:12.463 [2024-07-15 13:41:51.855796] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:23:12.721 [2024-07-15 13:41:51.966980] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:13.287 [2024-07-15 13:41:52.518201] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:13.546 [2024-07-15 13:41:52.780825] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.546 13:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.546 [2024-07-15 13:41:52.916914] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:13.805 "name": "raid_bdev1", 00:23:13.805 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:13.805 "strip_size_kb": 0, 00:23:13.805 "state": "online", 00:23:13.805 "raid_level": "raid1", 00:23:13.805 "superblock": false, 00:23:13.805 "num_base_bdevs": 2, 00:23:13.805 "num_base_bdevs_discovered": 2, 00:23:13.805 "num_base_bdevs_operational": 2, 
00:23:13.805 "process": { 00:23:13.805 "type": "rebuild", 00:23:13.805 "target": "spare", 00:23:13.805 "progress": { 00:23:13.805 "blocks": 16384, 00:23:13.805 "percent": 25 00:23:13.805 } 00:23:13.805 }, 00:23:13.805 "base_bdevs_list": [ 00:23:13.805 { 00:23:13.805 "name": "spare", 00:23:13.805 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:13.805 "is_configured": true, 00:23:13.805 "data_offset": 0, 00:23:13.805 "data_size": 65536 00:23:13.805 }, 00:23:13.805 { 00:23:13.805 "name": "BaseBdev2", 00:23:13.805 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:13.805 "is_configured": true, 00:23:13.805 "data_offset": 0, 00:23:13.805 "data_size": 65536 00:23:13.805 } 00:23:13.805 ] 00:23:13.805 }' 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:13.805 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:14.062 [2024-07-15 13:41:53.415000] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.320 [2024-07-15 13:41:53.536683] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:14.320 [2024-07-15 13:41:53.546689] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.320 [2024-07-15 13:41:53.546726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.320 [2024-07-15 13:41:53.546737] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such 
device 00:23:14.320 [2024-07-15 13:41:53.576819] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xab6bd0 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.320 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.584 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.584 "name": "raid_bdev1", 00:23:14.584 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:14.584 "strip_size_kb": 0, 00:23:14.584 "state": "online", 00:23:14.584 "raid_level": "raid1", 00:23:14.584 "superblock": false, 00:23:14.584 "num_base_bdevs": 2, 00:23:14.584 "num_base_bdevs_discovered": 1, 00:23:14.584 
"num_base_bdevs_operational": 1, 00:23:14.584 "base_bdevs_list": [ 00:23:14.584 { 00:23:14.584 "name": null, 00:23:14.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.584 "is_configured": false, 00:23:14.584 "data_offset": 0, 00:23:14.584 "data_size": 65536 00:23:14.584 }, 00:23:14.584 { 00:23:14.584 "name": "BaseBdev2", 00:23:14.584 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:14.584 "is_configured": true, 00:23:14.584 "data_offset": 0, 00:23:14.584 "data_size": 65536 00:23:14.584 } 00:23:14.584 ] 00:23:14.584 }' 00:23:14.584 13:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.584 13:41:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.151 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.410 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.410 "name": "raid_bdev1", 00:23:15.410 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:15.410 "strip_size_kb": 0, 00:23:15.410 "state": "online", 00:23:15.410 "raid_level": "raid1", 00:23:15.410 "superblock": false, 00:23:15.410 "num_base_bdevs": 2, 00:23:15.410 
"num_base_bdevs_discovered": 1, 00:23:15.410 "num_base_bdevs_operational": 1, 00:23:15.410 "base_bdevs_list": [ 00:23:15.410 { 00:23:15.410 "name": null, 00:23:15.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.410 "is_configured": false, 00:23:15.410 "data_offset": 0, 00:23:15.410 "data_size": 65536 00:23:15.410 }, 00:23:15.410 { 00:23:15.410 "name": "BaseBdev2", 00:23:15.410 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:15.410 "is_configured": true, 00:23:15.410 "data_offset": 0, 00:23:15.410 "data_size": 65536 00:23:15.410 } 00:23:15.410 ] 00:23:15.411 }' 00:23:15.411 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:15.411 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:15.411 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:15.411 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:15.411 13:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:15.669 [2024-07-15 13:41:55.004754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:15.669 13:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:15.669 [2024-07-15 13:41:55.080981] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xabc450 00:23:15.669 [2024-07-15 13:41:55.082503] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:15.928 [2024-07-15 13:41:55.191304] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:16.187 [2024-07-15 13:41:55.457629] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 
offset_begin: 0 offset_end: 6144 00:23:16.187 [2024-07-15 13:41:55.457814] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:16.445 [2024-07-15 13:41:55.730508] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:16.445 [2024-07-15 13:41:55.866605] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.703 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.962 [2024-07-15 13:41:56.155824] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:16.962 [2024-07-15 13:41:56.156353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:16.962 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.962 "name": "raid_bdev1", 00:23:16.962 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:16.962 "strip_size_kb": 0, 00:23:16.962 "state": "online", 
00:23:16.962 "raid_level": "raid1", 00:23:16.962 "superblock": false, 00:23:16.962 "num_base_bdevs": 2, 00:23:16.962 "num_base_bdevs_discovered": 2, 00:23:16.962 "num_base_bdevs_operational": 2, 00:23:16.962 "process": { 00:23:16.962 "type": "rebuild", 00:23:16.962 "target": "spare", 00:23:16.962 "progress": { 00:23:16.962 "blocks": 14336, 00:23:16.962 "percent": 21 00:23:16.962 } 00:23:16.962 }, 00:23:16.962 "base_bdevs_list": [ 00:23:16.962 { 00:23:16.962 "name": "spare", 00:23:16.962 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:16.962 "is_configured": true, 00:23:16.962 "data_offset": 0, 00:23:16.962 "data_size": 65536 00:23:16.962 }, 00:23:16.962 { 00:23:16.962 "name": "BaseBdev2", 00:23:16.962 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:16.962 "is_configured": true, 00:23:16.962 "data_offset": 0, 00:23:16.962 "data_size": 65536 00:23:16.962 } 00:23:16.962 ] 00:23:16.962 }' 00:23:16.962 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.962 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.962 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.962 [2024-07-15 13:41:56.374915] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:16.962 [2024-07-15 13:41:56.375217] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # 
'[' raid1 = raid1 ']' 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=820 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.221 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.479 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.479 "name": "raid_bdev1", 00:23:17.479 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:17.479 "strip_size_kb": 0, 00:23:17.479 "state": "online", 00:23:17.479 "raid_level": "raid1", 00:23:17.479 "superblock": false, 00:23:17.479 "num_base_bdevs": 2, 00:23:17.479 "num_base_bdevs_discovered": 2, 00:23:17.479 "num_base_bdevs_operational": 2, 00:23:17.479 "process": { 00:23:17.479 "type": "rebuild", 00:23:17.479 "target": "spare", 00:23:17.479 "progress": { 00:23:17.479 "blocks": 18432, 00:23:17.479 "percent": 28 00:23:17.479 } 00:23:17.479 }, 00:23:17.479 "base_bdevs_list": [ 00:23:17.479 { 00:23:17.479 "name": "spare", 00:23:17.479 "uuid": 
"19a9306b-5b39-5107-b900-295ada3a815c", 00:23:17.479 "is_configured": true, 00:23:17.479 "data_offset": 0, 00:23:17.479 "data_size": 65536 00:23:17.479 }, 00:23:17.479 { 00:23:17.480 "name": "BaseBdev2", 00:23:17.480 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:17.480 "is_configured": true, 00:23:17.480 "data_offset": 0, 00:23:17.480 "data_size": 65536 00:23:17.480 } 00:23:17.480 ] 00:23:17.480 }' 00:23:17.480 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.480 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.480 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.480 [2024-07-15 13:41:56.732680] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:17.480 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.480 13:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:17.480 [2024-07-15 13:41:56.852578] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:18.047 [2024-07-15 13:41:57.454109] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:18.305 [2024-07-15 13:41:57.698597] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.563 13:41:57 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.563 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.822 13:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:18.822 "name": "raid_bdev1", 00:23:18.822 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:18.822 "strip_size_kb": 0, 00:23:18.822 "state": "online", 00:23:18.822 "raid_level": "raid1", 00:23:18.823 "superblock": false, 00:23:18.823 "num_base_bdevs": 2, 00:23:18.823 "num_base_bdevs_discovered": 2, 00:23:18.823 "num_base_bdevs_operational": 2, 00:23:18.823 "process": { 00:23:18.823 "type": "rebuild", 00:23:18.823 "target": "spare", 00:23:18.823 "progress": { 00:23:18.823 "blocks": 36864, 00:23:18.823 "percent": 56 00:23:18.823 } 00:23:18.823 }, 00:23:18.823 "base_bdevs_list": [ 00:23:18.823 { 00:23:18.823 "name": "spare", 00:23:18.823 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:18.823 "is_configured": true, 00:23:18.823 "data_offset": 0, 00:23:18.823 "data_size": 65536 00:23:18.823 }, 00:23:18.823 { 00:23:18.823 "name": "BaseBdev2", 00:23:18.823 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:18.823 "is_configured": true, 00:23:18.823 "data_offset": 0, 00:23:18.823 "data_size": 65536 00:23:18.823 } 00:23:18.823 ] 00:23:18.823 }' 00:23:18.823 13:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:18.823 13:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild 
== \r\e\b\u\i\l\d ]] 00:23:18.823 13:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.823 13:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:18.823 13:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.758 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.016 "name": "raid_bdev1", 00:23:20.016 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:20.016 "strip_size_kb": 0, 00:23:20.016 "state": "online", 00:23:20.016 "raid_level": "raid1", 00:23:20.016 "superblock": false, 00:23:20.016 "num_base_bdevs": 2, 00:23:20.016 "num_base_bdevs_discovered": 2, 00:23:20.016 "num_base_bdevs_operational": 2, 00:23:20.016 "process": { 00:23:20.016 "type": "rebuild", 00:23:20.016 "target": "spare", 00:23:20.016 "progress": { 00:23:20.016 "blocks": 59392, 00:23:20.016 "percent": 90 00:23:20.016 } 00:23:20.016 }, 
00:23:20.016 "base_bdevs_list": [ 00:23:20.016 { 00:23:20.016 "name": "spare", 00:23:20.016 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:20.016 "is_configured": true, 00:23:20.016 "data_offset": 0, 00:23:20.016 "data_size": 65536 00:23:20.016 }, 00:23:20.016 { 00:23:20.016 "name": "BaseBdev2", 00:23:20.016 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:20.016 "is_configured": true, 00:23:20.016 "data_offset": 0, 00:23:20.016 "data_size": 65536 00:23:20.016 } 00:23:20.016 ] 00:23:20.016 }' 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:20.016 13:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:20.274 [2024-07-15 13:41:59.481154] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:20.274 [2024-07-15 13:41:59.581410] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:20.274 [2024-07-15 13:41:59.582993] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.206 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.206 "name": "raid_bdev1", 00:23:21.206 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:21.206 "strip_size_kb": 0, 00:23:21.206 "state": "online", 00:23:21.206 "raid_level": "raid1", 00:23:21.206 "superblock": false, 00:23:21.206 "num_base_bdevs": 2, 00:23:21.206 "num_base_bdevs_discovered": 2, 00:23:21.206 "num_base_bdevs_operational": 2, 00:23:21.206 "base_bdevs_list": [ 00:23:21.206 { 00:23:21.206 "name": "spare", 00:23:21.206 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:21.206 "is_configured": true, 00:23:21.206 "data_offset": 0, 00:23:21.206 "data_size": 65536 00:23:21.206 }, 00:23:21.206 { 00:23:21.206 "name": "BaseBdev2", 00:23:21.206 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:21.206 "is_configured": true, 00:23:21.206 "data_offset": 0, 00:23:21.206 "data_size": 65536 00:23:21.206 } 00:23:21.206 ] 00:23:21.206 }' 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:21.463 13:42:00 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.463 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.721 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.721 "name": "raid_bdev1", 00:23:21.721 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:21.721 "strip_size_kb": 0, 00:23:21.721 "state": "online", 00:23:21.721 "raid_level": "raid1", 00:23:21.721 "superblock": false, 00:23:21.721 "num_base_bdevs": 2, 00:23:21.721 "num_base_bdevs_discovered": 2, 00:23:21.721 "num_base_bdevs_operational": 2, 00:23:21.721 "base_bdevs_list": [ 00:23:21.721 { 00:23:21.721 "name": "spare", 00:23:21.721 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:21.721 "is_configured": true, 00:23:21.721 "data_offset": 0, 00:23:21.721 "data_size": 65536 00:23:21.721 }, 00:23:21.721 { 00:23:21.721 "name": "BaseBdev2", 00:23:21.721 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:21.721 "is_configured": true, 00:23:21.721 "data_offset": 0, 00:23:21.721 "data_size": 65536 00:23:21.721 } 00:23:21.721 ] 00:23:21.721 }' 00:23:21.721 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.721 13:42:00 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:21.721 13:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.721 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:21.721 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.722 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.979 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.980 "name": "raid_bdev1", 00:23:21.980 "uuid": "24edc15c-13be-45b9-b146-619fcf9b8304", 00:23:21.980 "strip_size_kb": 0, 00:23:21.980 "state": "online", 00:23:21.980 "raid_level": "raid1", 
00:23:21.980 "superblock": false, 00:23:21.980 "num_base_bdevs": 2, 00:23:21.980 "num_base_bdevs_discovered": 2, 00:23:21.980 "num_base_bdevs_operational": 2, 00:23:21.980 "base_bdevs_list": [ 00:23:21.980 { 00:23:21.980 "name": "spare", 00:23:21.980 "uuid": "19a9306b-5b39-5107-b900-295ada3a815c", 00:23:21.980 "is_configured": true, 00:23:21.980 "data_offset": 0, 00:23:21.980 "data_size": 65536 00:23:21.980 }, 00:23:21.980 { 00:23:21.980 "name": "BaseBdev2", 00:23:21.980 "uuid": "0d476032-87e7-5ec6-8eb4-aecf7c46015b", 00:23:21.980 "is_configured": true, 00:23:21.980 "data_offset": 0, 00:23:21.980 "data_size": 65536 00:23:21.980 } 00:23:21.980 ] 00:23:21.980 }' 00:23:21.980 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.980 13:42:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:22.545 13:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:22.804 [2024-07-15 13:42:02.123036] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:22.804 [2024-07-15 13:42:02.123068] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:22.804 00:23:22.804 Latency(us) 00:23:22.804 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:22.804 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:22.804 raid_bdev1 : 11.62 102.77 308.32 0.00 0.00 13950.16 284.94 119446.48 00:23:22.804 =================================================================================================================== 00:23:22.804 Total : 102.77 308.32 0.00 0.00 13950.16 284.94 119446.48 00:23:22.804 [2024-07-15 13:42:02.227292] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:22.804 [2024-07-15 13:42:02.227321] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:22.804 [2024-07-15 13:42:02.227398] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:22.804 [2024-07-15 13:42:02.227411] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xabc070 name raid_bdev1, state offline 00:23:22.804 0 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:23.100 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:23.400 /dev/nbd0 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:23.400 1+0 records in 00:23:23.400 1+0 records out 00:23:23.400 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279617 s, 14.6 MB/s 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:23.400 13:42:02 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:23.400 13:42:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:23.659 /dev/nbd1 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:23.659 
13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:23.659 1+0 records in 00:23:23.659 1+0 records out 00:23:23.659 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208396 s, 19.7 MB/s 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 
)) 00:23:23.659 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:23.917 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:23.918 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:24.176 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2182222 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2182222 ']' 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2182222 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:24.435 13:42:03 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2182222 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2182222' 00:23:24.435 killing process with pid 2182222 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2182222 00:23:24.435 Received shutdown signal, test time was about 13.147924 seconds 00:23:24.435 00:23:24.435 Latency(us) 00:23:24.435 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:24.435 =================================================================================================================== 00:23:24.435 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:24.435 [2024-07-15 13:42:03.757018] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:24.435 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2182222 00:23:24.435 [2024-07-15 13:42:03.777803] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:24.694 13:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:24.694 00:23:24.694 real 0m17.755s 00:23:24.694 user 0m26.883s 00:23:24.694 sys 0m2.810s 00:23:24.694 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:24.694 13:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:24.694 ************************************ 00:23:24.694 END TEST raid_rebuild_test_io 00:23:24.694 ************************************ 00:23:24.694 13:42:04 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:23:24.694 13:42:04 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:24.694 13:42:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:24.694 13:42:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:24.694 13:42:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:24.694 ************************************ 00:23:24.694 START TEST raid_rebuild_test_sb_io 00:23:24.694 ************************************ 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:24.694 13:42:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2184860 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2184860 /var/tmp/spdk-raid.sock 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2184860 ']' 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:24.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:24.694 13:42:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:24.952 [2024-07-15 13:42:04.136757] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:23:24.952 [2024-07-15 13:42:04.136821] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2184860 ] 00:23:24.952 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:24.952 Zero copy mechanism will not be used. 
00:23:24.952 [2024-07-15 13:42:04.266279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:24.952 [2024-07-15 13:42:04.369922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.210 [2024-07-15 13:42:04.440345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:25.210 [2024-07-15 13:42:04.440381] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:25.793 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:25.793 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:25.793 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:25.793 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:26.051 BaseBdev1_malloc 00:23:26.051 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:26.310 [2024-07-15 13:42:05.499276] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:26.310 [2024-07-15 13:42:05.499326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.310 [2024-07-15 13:42:05.499351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bddd40 00:23:26.310 [2024-07-15 13:42:05.499364] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.310 [2024-07-15 13:42:05.501055] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.310 [2024-07-15 13:42:05.501084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:26.310 
BaseBdev1 00:23:26.310 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:26.310 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:26.568 BaseBdev2_malloc 00:23:26.568 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:26.568 [2024-07-15 13:42:05.938126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:26.568 [2024-07-15 13:42:05.938174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.568 [2024-07-15 13:42:05.938198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bde860 00:23:26.568 [2024-07-15 13:42:05.938212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.568 [2024-07-15 13:42:05.939713] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.568 [2024-07-15 13:42:05.939742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:26.568 BaseBdev2 00:23:26.568 13:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:26.826 spare_malloc 00:23:26.826 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:27.085 spare_delay 00:23:27.085 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:27.343 [2024-07-15 13:42:06.689963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:27.343 [2024-07-15 13:42:06.690011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.343 [2024-07-15 13:42:06.690033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d8cec0 00:23:27.343 [2024-07-15 13:42:06.690051] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.343 [2024-07-15 13:42:06.691641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.343 [2024-07-15 13:42:06.691670] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:27.343 spare 00:23:27.343 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:27.612 [2024-07-15 13:42:06.930617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:27.612 [2024-07-15 13:42:06.932022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:27.612 [2024-07-15 13:42:06.932205] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d8e070 00:23:27.612 [2024-07-15 13:42:06.932219] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:27.612 [2024-07-15 13:42:06.932420] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d87490 00:23:27.612 [2024-07-15 13:42:06.932564] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d8e070 00:23:27.612 [2024-07-15 13:42:06.932574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1d8e070 00:23:27.612 [2024-07-15 13:42:06.932679] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.612 13:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.870 13:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.870 "name": "raid_bdev1", 00:23:27.870 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:27.870 "strip_size_kb": 0, 00:23:27.870 "state": "online", 00:23:27.870 "raid_level": "raid1", 00:23:27.870 "superblock": true, 00:23:27.870 "num_base_bdevs": 2, 00:23:27.870 
"num_base_bdevs_discovered": 2, 00:23:27.870 "num_base_bdevs_operational": 2, 00:23:27.870 "base_bdevs_list": [ 00:23:27.870 { 00:23:27.870 "name": "BaseBdev1", 00:23:27.870 "uuid": "ca90f90f-7861-5b59-939a-9003bbb9bfaf", 00:23:27.870 "is_configured": true, 00:23:27.870 "data_offset": 2048, 00:23:27.870 "data_size": 63488 00:23:27.870 }, 00:23:27.870 { 00:23:27.870 "name": "BaseBdev2", 00:23:27.870 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:27.870 "is_configured": true, 00:23:27.870 "data_offset": 2048, 00:23:27.870 "data_size": 63488 00:23:27.870 } 00:23:27.870 ] 00:23:27.870 }' 00:23:27.870 13:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.870 13:42:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:28.437 13:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:28.437 13:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:28.695 [2024-07-15 13:42:08.017876] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:28.695 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:28.695 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.695 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:28.954 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:28.954 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:28.954 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:28.954 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:29.212 [2024-07-15 13:42:08.396728] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d8ec50 00:23:29.212 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:29.212 Zero copy mechanism will not be used. 00:23:29.212 Running I/O for 60 seconds... 00:23:29.212 [2024-07-15 13:42:08.510921] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:29.212 [2024-07-15 13:42:08.530009] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d8ec50 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.212 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.470 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:29.470 "name": "raid_bdev1", 00:23:29.470 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:29.470 "strip_size_kb": 0, 00:23:29.470 "state": "online", 00:23:29.470 "raid_level": "raid1", 00:23:29.470 "superblock": true, 00:23:29.470 "num_base_bdevs": 2, 00:23:29.470 "num_base_bdevs_discovered": 1, 00:23:29.470 "num_base_bdevs_operational": 1, 00:23:29.470 "base_bdevs_list": [ 00:23:29.470 { 00:23:29.470 "name": null, 00:23:29.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.470 "is_configured": false, 00:23:29.470 "data_offset": 2048, 00:23:29.470 "data_size": 63488 00:23:29.470 }, 00:23:29.470 { 00:23:29.470 "name": "BaseBdev2", 00:23:29.470 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:29.470 "is_configured": true, 00:23:29.470 "data_offset": 2048, 00:23:29.470 "data_size": 63488 00:23:29.470 } 00:23:29.470 ] 00:23:29.470 }' 00:23:29.470 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:29.470 13:42:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:30.035 13:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:30.293 [2024-07-15 13:42:09.541327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:30.293 13:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:30.293 [2024-07-15 13:42:09.601239] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfa230 00:23:30.293 [2024-07-15 13:42:09.603720] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:30.551 [2024-07-15 13:42:09.736409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:30.551 [2024-07-15 13:42:09.736841] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:30.551 [2024-07-15 13:42:09.973233] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:30.551 [2024-07-15 13:42:09.973506] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:31.115 [2024-07-15 13:42:10.325621] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:31.115 [2024-07-15 13:42:10.440075] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.372 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.372 [2024-07-15 13:42:10.740980] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:31.630 [2024-07-15 13:42:10.853080] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:31.630 [2024-07-15 13:42:10.853291] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:31.630 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:31.630 "name": "raid_bdev1", 00:23:31.630 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:31.630 "strip_size_kb": 0, 00:23:31.630 "state": "online", 00:23:31.630 "raid_level": "raid1", 00:23:31.630 "superblock": true, 00:23:31.630 "num_base_bdevs": 2, 00:23:31.630 "num_base_bdevs_discovered": 2, 00:23:31.630 "num_base_bdevs_operational": 2, 00:23:31.630 "process": { 00:23:31.630 "type": "rebuild", 00:23:31.630 "target": "spare", 00:23:31.630 "progress": { 00:23:31.630 "blocks": 14336, 00:23:31.630 "percent": 22 00:23:31.630 } 00:23:31.630 }, 00:23:31.630 "base_bdevs_list": [ 00:23:31.630 { 00:23:31.630 "name": "spare", 00:23:31.630 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:31.630 "is_configured": true, 00:23:31.630 "data_offset": 2048, 00:23:31.630 "data_size": 63488 00:23:31.630 }, 00:23:31.630 { 00:23:31.630 "name": "BaseBdev2", 00:23:31.630 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:31.630 "is_configured": true, 00:23:31.630 "data_offset": 2048, 00:23:31.630 "data_size": 63488 00:23:31.630 } 00:23:31.630 ] 00:23:31.630 }' 00:23:31.630 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:31.630 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:31.630 
13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:31.630 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:31.630 13:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:31.888 [2024-07-15 13:42:11.159590] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.888 [2024-07-15 13:42:11.199639] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:31.888 [2024-07-15 13:42:11.307967] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:32.146 [2024-07-15 13:42:11.318088] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.146 [2024-07-15 13:42:11.318119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.146 [2024-07-15 13:42:11.318130] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:32.146 [2024-07-15 13:42:11.346395] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d8ec50 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.146 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.404 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.404 "name": "raid_bdev1", 00:23:32.404 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:32.404 "strip_size_kb": 0, 00:23:32.404 "state": "online", 00:23:32.404 "raid_level": "raid1", 00:23:32.404 "superblock": true, 00:23:32.404 "num_base_bdevs": 2, 00:23:32.404 "num_base_bdevs_discovered": 1, 00:23:32.404 "num_base_bdevs_operational": 1, 00:23:32.404 "base_bdevs_list": [ 00:23:32.404 { 00:23:32.404 "name": null, 00:23:32.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.404 "is_configured": false, 00:23:32.404 "data_offset": 2048, 00:23:32.404 "data_size": 63488 00:23:32.404 }, 00:23:32.404 { 00:23:32.404 "name": "BaseBdev2", 00:23:32.404 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:32.404 "is_configured": true, 00:23:32.404 "data_offset": 2048, 00:23:32.404 "data_size": 63488 00:23:32.404 } 00:23:32.404 ] 00:23:32.404 }' 00:23:32.404 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.404 13:42:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 
00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.969 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.226 "name": "raid_bdev1", 00:23:33.226 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:33.226 "strip_size_kb": 0, 00:23:33.226 "state": "online", 00:23:33.226 "raid_level": "raid1", 00:23:33.226 "superblock": true, 00:23:33.226 "num_base_bdevs": 2, 00:23:33.226 "num_base_bdevs_discovered": 1, 00:23:33.226 "num_base_bdevs_operational": 1, 00:23:33.226 "base_bdevs_list": [ 00:23:33.226 { 00:23:33.226 "name": null, 00:23:33.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.226 "is_configured": false, 00:23:33.226 "data_offset": 2048, 00:23:33.226 "data_size": 63488 00:23:33.226 }, 00:23:33.226 { 00:23:33.226 "name": "BaseBdev2", 00:23:33.226 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:33.226 "is_configured": true, 00:23:33.226 "data_offset": 2048, 00:23:33.226 "data_size": 63488 00:23:33.226 } 00:23:33.226 ] 00:23:33.226 }' 00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.226 13:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:33.815 [2024-07-15 13:42:13.104049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:33.815 13:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:33.815 [2024-07-15 13:42:13.176092] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d8ee60 00:23:33.815 [2024-07-15 13:42:13.177594] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:34.379 [2024-07-15 13:42:13.499360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:34.640 [2024-07-15 13:42:13.875219] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:34.640 [2024-07-15 13:42:13.875618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:34.898 [2024-07-15 13:42:14.123884] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:34.898 [2024-07-15 13:42:14.124077] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:34.898 13:42:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.898 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.155 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.155 "name": "raid_bdev1", 00:23:35.155 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:35.155 "strip_size_kb": 0, 00:23:35.155 "state": "online", 00:23:35.155 "raid_level": "raid1", 00:23:35.155 "superblock": true, 00:23:35.155 "num_base_bdevs": 2, 00:23:35.155 "num_base_bdevs_discovered": 2, 00:23:35.155 "num_base_bdevs_operational": 2, 00:23:35.155 "process": { 00:23:35.155 "type": "rebuild", 00:23:35.156 "target": "spare", 00:23:35.156 "progress": { 00:23:35.156 "blocks": 12288, 00:23:35.156 "percent": 19 00:23:35.156 } 00:23:35.156 }, 00:23:35.156 "base_bdevs_list": [ 00:23:35.156 { 00:23:35.156 "name": "spare", 00:23:35.156 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:35.156 "is_configured": true, 00:23:35.156 "data_offset": 2048, 00:23:35.156 "data_size": 63488 00:23:35.156 }, 00:23:35.156 { 00:23:35.156 "name": "BaseBdev2", 00:23:35.156 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:35.156 "is_configured": true, 00:23:35.156 "data_offset": 2048, 00:23:35.156 "data_size": 63488 00:23:35.156 } 00:23:35.156 ] 00:23:35.156 }' 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.156 [2024-07-15 13:42:14.466844] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:35.156 [2024-07-15 13:42:14.467324] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:35.156 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=838 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.156 
13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.156 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.413 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.413 "name": "raid_bdev1", 00:23:35.413 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:35.413 "strip_size_kb": 0, 00:23:35.413 "state": "online", 00:23:35.413 "raid_level": "raid1", 00:23:35.413 "superblock": true, 00:23:35.413 "num_base_bdevs": 2, 00:23:35.413 "num_base_bdevs_discovered": 2, 00:23:35.413 "num_base_bdevs_operational": 2, 00:23:35.413 "process": { 00:23:35.413 "type": "rebuild", 00:23:35.413 "target": "spare", 00:23:35.413 "progress": { 00:23:35.413 "blocks": 18432, 00:23:35.413 "percent": 29 00:23:35.413 } 00:23:35.413 }, 00:23:35.413 "base_bdevs_list": [ 00:23:35.413 { 00:23:35.413 "name": "spare", 00:23:35.413 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:35.414 "is_configured": true, 00:23:35.414 "data_offset": 2048, 00:23:35.414 "data_size": 63488 00:23:35.414 }, 00:23:35.414 { 00:23:35.414 "name": "BaseBdev2", 00:23:35.414 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:35.414 "is_configured": true, 00:23:35.414 "data_offset": 2048, 00:23:35.414 "data_size": 63488 00:23:35.414 } 00:23:35.414 ] 00:23:35.414 }' 00:23:35.414 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.414 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:35.414 13:42:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.414 [2024-07-15 13:42:14.802769] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:35.671 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.671 13:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:35.929 [2024-07-15 13:42:15.225695] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:36.187 [2024-07-15 13:42:15.436834] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:36.444 [2024-07-15 13:42:15.684672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:36.444 [2024-07-15 13:42:15.803272] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:36.444 13:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.009 [2024-07-15 13:42:16.159195] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:37.010 13:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.010 "name": "raid_bdev1", 00:23:37.010 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:37.010 "strip_size_kb": 0, 00:23:37.010 "state": "online", 00:23:37.010 "raid_level": "raid1", 00:23:37.010 "superblock": true, 00:23:37.010 "num_base_bdevs": 2, 00:23:37.010 "num_base_bdevs_discovered": 2, 00:23:37.010 "num_base_bdevs_operational": 2, 00:23:37.010 "process": { 00:23:37.010 "type": "rebuild", 00:23:37.010 "target": "spare", 00:23:37.010 "progress": { 00:23:37.010 "blocks": 40960, 00:23:37.010 "percent": 64 00:23:37.010 } 00:23:37.010 }, 00:23:37.010 "base_bdevs_list": [ 00:23:37.010 { 00:23:37.010 "name": "spare", 00:23:37.010 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:37.010 "is_configured": true, 00:23:37.010 "data_offset": 2048, 00:23:37.010 "data_size": 63488 00:23:37.010 }, 00:23:37.010 { 00:23:37.010 "name": "BaseBdev2", 00:23:37.010 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:37.010 "is_configured": true, 00:23:37.010 "data_offset": 2048, 00:23:37.010 "data_size": 63488 00:23:37.010 } 00:23:37.010 ] 00:23:37.010 }' 00:23:37.010 13:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.010 13:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.010 13:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.267 13:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.267 13:42:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:37.525 [2024-07-15 13:42:16.819882] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:37.525 [2024-07-15 13:42:16.921602] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:37.782 [2024-07-15 13:42:17.150809] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.040 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.298 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.298 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.298 [2024-07-15 13:42:17.599681] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.298 [2024-07-15 13:42:17.699964] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.298 [2024-07-15 13:42:17.701570] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:38.298 13:42:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.298 "name": "raid_bdev1", 00:23:38.298 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:38.298 "strip_size_kb": 0, 00:23:38.298 "state": "online", 00:23:38.298 "raid_level": "raid1", 00:23:38.298 "superblock": true, 00:23:38.298 "num_base_bdevs": 2, 00:23:38.298 "num_base_bdevs_discovered": 2, 00:23:38.298 "num_base_bdevs_operational": 2, 00:23:38.298 "process": { 00:23:38.298 "type": "rebuild", 00:23:38.298 "target": "spare", 00:23:38.298 "progress": { 00:23:38.298 "blocks": 63488, 00:23:38.298 "percent": 100 00:23:38.298 } 00:23:38.298 }, 00:23:38.298 "base_bdevs_list": [ 00:23:38.298 { 00:23:38.298 "name": "spare", 00:23:38.298 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:38.298 "is_configured": true, 00:23:38.298 "data_offset": 2048, 00:23:38.298 "data_size": 63488 00:23:38.298 }, 00:23:38.298 { 00:23:38.298 "name": "BaseBdev2", 00:23:38.298 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:38.298 "is_configured": true, 00:23:38.298 "data_offset": 2048, 00:23:38.298 "data_size": 63488 00:23:38.298 } 00:23:38.298 ] 00:23:38.298 }' 00:23:38.555 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.555 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:38.555 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.555 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:38.555 13:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.518 13:42:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.518 13:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.783 "name": "raid_bdev1", 00:23:39.783 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:39.783 "strip_size_kb": 0, 00:23:39.783 "state": "online", 00:23:39.783 "raid_level": "raid1", 00:23:39.783 "superblock": true, 00:23:39.783 "num_base_bdevs": 2, 00:23:39.783 "num_base_bdevs_discovered": 2, 00:23:39.783 "num_base_bdevs_operational": 2, 00:23:39.783 "base_bdevs_list": [ 00:23:39.783 { 00:23:39.783 "name": "spare", 00:23:39.783 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:39.783 "is_configured": true, 00:23:39.783 "data_offset": 2048, 00:23:39.783 "data_size": 63488 00:23:39.783 }, 00:23:39.783 { 00:23:39.783 "name": "BaseBdev2", 00:23:39.783 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:39.783 "is_configured": true, 00:23:39.783 "data_offset": 2048, 00:23:39.783 "data_size": 63488 00:23:39.783 } 00:23:39.783 ] 00:23:39.783 }' 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:39.783 13:42:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.783 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.041 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:40.041 "name": "raid_bdev1", 00:23:40.041 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:40.041 "strip_size_kb": 0, 00:23:40.041 "state": "online", 00:23:40.041 "raid_level": "raid1", 00:23:40.041 "superblock": true, 00:23:40.041 "num_base_bdevs": 2, 00:23:40.041 "num_base_bdevs_discovered": 2, 00:23:40.041 "num_base_bdevs_operational": 2, 00:23:40.041 "base_bdevs_list": [ 00:23:40.041 { 00:23:40.041 "name": "spare", 00:23:40.041 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:40.041 "is_configured": true, 00:23:40.041 "data_offset": 2048, 00:23:40.041 "data_size": 63488 00:23:40.041 }, 00:23:40.041 { 00:23:40.041 "name": "BaseBdev2", 00:23:40.041 "uuid": 
"07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:40.041 "is_configured": true, 00:23:40.041 "data_offset": 2048, 00:23:40.041 "data_size": 63488 00:23:40.041 } 00:23:40.041 ] 00:23:40.041 }' 00:23:40.041 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:40.041 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:40.041 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.299 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.299 13:42:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.556 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.556 "name": "raid_bdev1", 00:23:40.556 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:40.556 "strip_size_kb": 0, 00:23:40.556 "state": "online", 00:23:40.556 "raid_level": "raid1", 00:23:40.556 "superblock": true, 00:23:40.556 "num_base_bdevs": 2, 00:23:40.556 "num_base_bdevs_discovered": 2, 00:23:40.556 "num_base_bdevs_operational": 2, 00:23:40.556 "base_bdevs_list": [ 00:23:40.556 { 00:23:40.556 "name": "spare", 00:23:40.556 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981", 00:23:40.556 "is_configured": true, 00:23:40.556 "data_offset": 2048, 00:23:40.556 "data_size": 63488 00:23:40.556 }, 00:23:40.556 { 00:23:40.556 "name": "BaseBdev2", 00:23:40.556 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:40.556 "is_configured": true, 00:23:40.556 "data_offset": 2048, 00:23:40.556 "data_size": 63488 00:23:40.556 } 00:23:40.556 ] 00:23:40.556 }' 00:23:40.556 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.556 13:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:41.120 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:41.378 [2024-07-15 13:42:20.578842] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:41.378 [2024-07-15 13:42:20.578873] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:41.378 00:23:41.378 Latency(us) 00:23:41.378 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:41.378 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:41.378 raid_bdev1 : 
12.25 91.62 274.86 0.00 0.00 14769.49 283.16 116711.07
00:23:41.378 ===================================================================================================================
00:23:41.378 Total : 91.62 274.86 0.00 0.00 14769.49 283.16 116711.07
00:23:41.378 [2024-07-15 13:42:20.679038] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:41.378 [2024-07-15 13:42:20.679066] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:23:41.378 [2024-07-15 13:42:20.679139] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:23:41.378 [2024-07-15 13:42:20.679151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d8e070 name raid_bdev1, state offline
00:23:41.378 0
00:23:41.378 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length
00:23:41.378 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']'
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare')
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:41.636 13:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0
00:23:41.894 /dev/nbd0
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:41.894 1+0 records in
00:23:41.894 1+0 records out
00:23:41.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256104 s, 16.0 MB/s
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}"
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']'
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2')
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1')
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:41.894 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1
00:23:42.152 /dev/nbd1
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:42.152 1+0 records in
00:23:42.152 1+0 records out
00:23:42.152 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252166 s, 16.2 MB/s
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:23:42.152 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1')
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:23:42.408 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:23:42.665 13:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']'
00:23:42.923 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:23:43.181 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:23:43.439 [2024-07-15 13:42:22.638378] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:23:43.439 [2024-07-15 13:42:22.638428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:23:43.439 [2024-07-15 13:42:22.638450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bdd490
00:23:43.439 [2024-07-15 13:42:22.638463] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:23:43.439 [2024-07-15 13:42:22.640122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:23:43.439 [2024-07-15 13:42:22.640152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:23:43.439 [2024-07-15 13:42:22.640239] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare
00:23:43.439 [2024-07-15 13:42:22.640267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:43.439 [2024-07-15 13:42:22.640369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:23:43.439 spare
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:43.439 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:43.439 [2024-07-15 13:42:22.740683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bdcf70
00:23:43.439 [2024-07-15 13:42:22.740702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:23:43.439 [2024-07-15 13:42:22.740895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d86f50
00:23:43.439 [2024-07-15 13:42:22.741052] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bdcf70
00:23:43.439 [2024-07-15 13:42:22.741062] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bdcf70
00:23:43.439 [2024-07-15 13:42:22.741171] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:43.718 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:43.718 "name": "raid_bdev1",
00:23:43.718 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:43.718 "strip_size_kb": 0,
00:23:43.718 "state": "online",
00:23:43.718 "raid_level": "raid1",
00:23:43.718 "superblock": true,
00:23:43.718 "num_base_bdevs": 2,
00:23:43.718 "num_base_bdevs_discovered": 2,
00:23:43.718 "num_base_bdevs_operational": 2,
00:23:43.718 "base_bdevs_list": [
00:23:43.718 {
00:23:43.718 "name": "spare",
00:23:43.718 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981",
00:23:43.718 "is_configured": true,
00:23:43.718 "data_offset": 2048,
00:23:43.718 "data_size": 63488
00:23:43.718 },
00:23:43.718 {
00:23:43.718 "name": "BaseBdev2",
00:23:43.718 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:43.718 "is_configured": true,
00:23:43.718 "data_offset": 2048,
00:23:43.718 "data_size": 63488
00:23:43.718 }
00:23:43.718 ]
00:23:43.718 }'
00:23:43.718 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:43.718 13:42:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:44.282 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:44.540 "name": "raid_bdev1",
00:23:44.540 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:44.540 "strip_size_kb": 0,
00:23:44.540 "state": "online",
00:23:44.540 "raid_level": "raid1",
00:23:44.540 "superblock": true,
00:23:44.540 "num_base_bdevs": 2,
00:23:44.540 "num_base_bdevs_discovered": 2,
00:23:44.540 "num_base_bdevs_operational": 2,
00:23:44.540 "base_bdevs_list": [
00:23:44.540 {
00:23:44.540 "name": "spare",
00:23:44.540 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981",
00:23:44.540 "is_configured": true,
00:23:44.540 "data_offset": 2048,
00:23:44.540 "data_size": 63488
00:23:44.540 },
00:23:44.540 {
00:23:44.540 "name": "BaseBdev2",
00:23:44.540 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:44.540 "is_configured": true,
00:23:44.540 "data_offset": 2048,
00:23:44.540 "data_size": 63488
00:23:44.540 }
00:23:44.540 ]
00:23:44.540 }'
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:44.540 13:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name'
00:23:44.798 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]]
00:23:44.798 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:23:45.055 [2024-07-15 13:42:24.226904] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:45.055 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:45.312 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:45.312 "name": "raid_bdev1",
00:23:45.312 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:45.312 "strip_size_kb": 0,
00:23:45.312 "state": "online",
00:23:45.312 "raid_level": "raid1",
00:23:45.312 "superblock": true,
00:23:45.312 "num_base_bdevs": 2,
00:23:45.312 "num_base_bdevs_discovered": 1,
00:23:45.312 "num_base_bdevs_operational": 1,
00:23:45.312 "base_bdevs_list": [
00:23:45.312 {
00:23:45.312 "name": null,
00:23:45.312 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:45.312 "is_configured": false,
00:23:45.312 "data_offset": 2048,
00:23:45.312 "data_size": 63488
00:23:45.312 },
00:23:45.312 {
00:23:45.312 "name": "BaseBdev2",
00:23:45.312 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:45.312 "is_configured": true,
00:23:45.312 "data_offset": 2048,
00:23:45.312 "data_size": 63488
00:23:45.312 }
00:23:45.312 ]
00:23:45.312 }'
00:23:45.312 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:45.312 13:42:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:23:45.877 13:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:23:45.877 [2024-07-15 13:42:25.233748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:45.877 [2024-07-15 13:42:25.233902] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5)
00:23:45.877 [2024-07-15 13:42:25.233921] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1.
00:23:45.877 [2024-07-15 13:42:25.233957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:45.877 [2024-07-15 13:42:25.239186] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d87490
00:23:45.877 [2024-07-15 13:42:25.241504] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:23:45.877 13:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:47.250 "name": "raid_bdev1",
00:23:47.250 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:47.250 "strip_size_kb": 0,
00:23:47.250 "state": "online",
00:23:47.250 "raid_level": "raid1",
00:23:47.250 "superblock": true,
00:23:47.250 "num_base_bdevs": 2,
00:23:47.250 "num_base_bdevs_discovered": 2,
00:23:47.250 "num_base_bdevs_operational": 2,
00:23:47.250 "process": {
00:23:47.250 "type": "rebuild",
00:23:47.250 "target": "spare",
00:23:47.250 "progress": {
00:23:47.250 "blocks": 24576,
00:23:47.250 "percent": 38
00:23:47.250 }
00:23:47.250 },
00:23:47.250 "base_bdevs_list": [
00:23:47.250 {
00:23:47.250 "name": "spare",
00:23:47.250 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981",
00:23:47.250 "is_configured": true,
00:23:47.250 "data_offset": 2048,
00:23:47.250 "data_size": 63488
00:23:47.250 },
00:23:47.250 {
00:23:47.250 "name": "BaseBdev2",
00:23:47.250 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:47.250 "is_configured": true,
00:23:47.250 "data_offset": 2048,
00:23:47.250 "data_size": 63488
00:23:47.250 }
00:23:47.250 ]
00:23:47.250 }'
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:47.250 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:23:47.507 [2024-07-15 13:42:26.841001] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:47.507 [2024-07-15 13:42:26.854352] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:23:47.507 [2024-07-15 13:42:26.854398] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:47.507 [2024-07-15 13:42:26.854413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:47.507 [2024-07-15 13:42:26.854422] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:47.508 13:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:47.765 13:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:47.765 "name": "raid_bdev1",
00:23:47.765 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:47.765 "strip_size_kb": 0,
00:23:47.765 "state": "online",
00:23:47.765 "raid_level": "raid1",
00:23:47.765 "superblock": true,
00:23:47.765 "num_base_bdevs": 2,
00:23:47.765 "num_base_bdevs_discovered": 1,
00:23:47.765 "num_base_bdevs_operational": 1,
00:23:47.765 "base_bdevs_list": [
00:23:47.765 {
00:23:47.765 "name": null,
00:23:47.765 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:47.765 "is_configured": false,
00:23:47.765 "data_offset": 2048,
00:23:47.765 "data_size": 63488
00:23:47.765 },
00:23:47.765 {
00:23:47.765 "name": "BaseBdev2",
00:23:47.765 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:47.765 "is_configured": true,
00:23:47.765 "data_offset": 2048,
00:23:47.765 "data_size": 63488
00:23:47.765 }
00:23:47.765 ]
00:23:47.765 }'
00:23:47.765 13:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:47.765 13:42:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:23:48.357 13:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:23:48.614 [2024-07-15 13:42:27.958133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:23:48.614 [2024-07-15 13:42:27.958187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:23:48.614 [2024-07-15 13:42:27.958209] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bf0d20
00:23:48.614 [2024-07-15 13:42:27.958222] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:23:48.614 [2024-07-15 13:42:27.958585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:23:48.614 [2024-07-15 13:42:27.958603] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:23:48.614 [2024-07-15 13:42:27.958683] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare
00:23:48.614 [2024-07-15 13:42:27.958696] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5)
00:23:48.614 [2024-07-15 13:42:27.958708] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1.
00:23:48.615 [2024-07-15 13:42:27.958727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:23:48.615 [2024-07-15 13:42:27.964069] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d87490
00:23:48.615 spare
00:23:48.615 [2024-07-15 13:42:27.965519] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:23:48.615 13:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:49.988 13:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:23:49.988 "name": "raid_bdev1",
00:23:49.988 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:49.988 "strip_size_kb": 0,
00:23:49.988 "state": "online",
00:23:49.988 "raid_level": "raid1",
00:23:49.988 "superblock": true,
00:23:49.988 "num_base_bdevs": 2,
00:23:49.988 "num_base_bdevs_discovered": 2,
00:23:49.988 "num_base_bdevs_operational": 2,
00:23:49.988 "process": {
00:23:49.988 "type": "rebuild",
00:23:49.988 "target": "spare",
00:23:49.988 "progress": {
00:23:49.988 "blocks": 24576,
00:23:49.988 "percent": 38
00:23:49.988 }
00:23:49.988 },
00:23:49.988 "base_bdevs_list": [
00:23:49.988 {
00:23:49.988 "name": "spare",
00:23:49.988 "uuid": "9e67a4ee-9a31-5e4c-8678-05845578d981",
00:23:49.988 "is_configured": true,
00:23:49.988 "data_offset": 2048,
00:23:49.988 "data_size": 63488
00:23:49.988 },
00:23:49.988 {
00:23:49.988 "name": "BaseBdev2",
00:23:49.988 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab",
00:23:49.988 "is_configured": true,
00:23:49.988 "data_offset": 2048,
00:23:49.988 "data_size": 63488
00:23:49.988 }
00:23:49.988 ]
00:23:49.988 }'
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:23:49.988 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:23:50.246 [2024-07-15 13:42:29.476682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:50.246 [2024-07-15 13:42:29.477503] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:23:50.246 [2024-07-15 13:42:29.477549] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:23:50.246 [2024-07-15 13:42:29.477565] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:23:50.246 [2024-07-15 13:42:29.477573] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:50.246 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:50.503 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:50.503 "name": "raid_bdev1",
00:23:50.503 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d",
00:23:50.503 "strip_size_kb": 0,
00:23:50.503 "state": "online",
00:23:50.503 "raid_level": "raid1",
00:23:50.503 "superblock": true,
00:23:50.503 "num_base_bdevs": 2,
00:23:50.503 "num_base_bdevs_discovered": 1,
00:23:50.503 "num_base_bdevs_operational": 1,
00:23:50.503 "base_bdevs_list": [
00:23:50.503 {
00:23:50.503 "name": null,
00:23:50.503 "uuid": "00000000-0000-0000-0000-000000000000",
00:23:50.503 "is_configured": false,
00:23:50.503
"data_offset": 2048, 00:23:50.503 "data_size": 63488 00:23:50.503 }, 00:23:50.503 { 00:23:50.503 "name": "BaseBdev2", 00:23:50.503 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:50.503 "is_configured": true, 00:23:50.503 "data_offset": 2048, 00:23:50.503 "data_size": 63488 00:23:50.503 } 00:23:50.503 ] 00:23:50.503 }' 00:23:50.503 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.503 13:42:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.069 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.326 "name": "raid_bdev1", 00:23:51.326 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:51.326 "strip_size_kb": 0, 00:23:51.326 "state": "online", 00:23:51.326 "raid_level": "raid1", 00:23:51.326 "superblock": true, 00:23:51.326 "num_base_bdevs": 2, 00:23:51.326 "num_base_bdevs_discovered": 1, 00:23:51.326 "num_base_bdevs_operational": 1, 00:23:51.326 "base_bdevs_list": [ 00:23:51.326 { 00:23:51.326 "name": null, 00:23:51.326 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:51.326 "is_configured": false, 00:23:51.326 "data_offset": 2048, 00:23:51.326 "data_size": 63488 00:23:51.326 }, 00:23:51.326 { 00:23:51.326 "name": "BaseBdev2", 00:23:51.326 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:51.326 "is_configured": true, 00:23:51.326 "data_offset": 2048, 00:23:51.326 "data_size": 63488 00:23:51.326 } 00:23:51.326 ] 00:23:51.326 }' 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:51.326 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:51.584 13:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:51.584 [2024-07-15 13:42:31.006420] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:51.584 [2024-07-15 13:42:31.006471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.584 [2024-07-15 13:42:31.006494] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d8f140 00:23:51.584 [2024-07-15 13:42:31.006507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.584 [2024-07-15 13:42:31.006839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.584 [2024-07-15 13:42:31.006856] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:51.584 [2024-07-15 13:42:31.006921] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:51.584 [2024-07-15 13:42:31.006946] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:51.584 [2024-07-15 13:42:31.006961] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:51.841 BaseBdev1 00:23:51.841 13:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.774 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.032 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.032 "name": "raid_bdev1", 00:23:53.032 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:53.032 "strip_size_kb": 0, 00:23:53.032 "state": "online", 00:23:53.032 "raid_level": "raid1", 00:23:53.032 "superblock": true, 00:23:53.032 "num_base_bdevs": 2, 00:23:53.032 "num_base_bdevs_discovered": 1, 00:23:53.032 "num_base_bdevs_operational": 1, 00:23:53.032 "base_bdevs_list": [ 00:23:53.032 { 00:23:53.032 "name": null, 00:23:53.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.091 "is_configured": false, 00:23:53.091 "data_offset": 2048, 00:23:53.091 "data_size": 63488 00:23:53.091 }, 00:23:53.091 { 00:23:53.091 "name": "BaseBdev2", 00:23:53.091 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:53.091 "is_configured": true, 00:23:53.091 "data_offset": 2048, 00:23:53.091 "data_size": 63488 00:23:53.091 } 00:23:53.091 ] 00:23:53.091 }' 00:23:53.091 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.091 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.655 13:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.913 "name": "raid_bdev1", 00:23:53.913 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:53.913 "strip_size_kb": 0, 00:23:53.913 "state": "online", 00:23:53.913 "raid_level": "raid1", 00:23:53.913 "superblock": true, 00:23:53.913 "num_base_bdevs": 2, 00:23:53.913 "num_base_bdevs_discovered": 1, 00:23:53.913 "num_base_bdevs_operational": 1, 00:23:53.913 "base_bdevs_list": [ 00:23:53.913 { 00:23:53.913 "name": null, 00:23:53.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.913 "is_configured": false, 00:23:53.913 "data_offset": 2048, 00:23:53.913 "data_size": 63488 00:23:53.913 }, 00:23:53.913 { 00:23:53.913 "name": "BaseBdev2", 00:23:53.913 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:53.913 "is_configured": true, 00:23:53.913 "data_offset": 2048, 00:23:53.913 "data_size": 63488 00:23:53.913 } 00:23:53.913 ] 00:23:53.913 }' 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:53.913 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.170 [2024-07-15 13:42:33.357018] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.170 [2024-07-15 13:42:33.357142] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:54.170 
[2024-07-15 13:42:33.357158] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:54.170 request: 00:23:54.170 { 00:23:54.170 "base_bdev": "BaseBdev1", 00:23:54.170 "raid_bdev": "raid_bdev1", 00:23:54.170 "method": "bdev_raid_add_base_bdev", 00:23:54.170 "req_id": 1 00:23:54.170 } 00:23:54.170 Got JSON-RPC error response 00:23:54.170 response: 00:23:54.170 { 00:23:54.170 "code": -22, 00:23:54.170 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:54.170 } 00:23:54.170 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:23:54.170 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:54.170 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:54.170 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:54.170 13:42:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.102 13:42:34 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.102 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.361 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.361 "name": "raid_bdev1", 00:23:55.361 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:55.361 "strip_size_kb": 0, 00:23:55.361 "state": "online", 00:23:55.361 "raid_level": "raid1", 00:23:55.361 "superblock": true, 00:23:55.361 "num_base_bdevs": 2, 00:23:55.361 "num_base_bdevs_discovered": 1, 00:23:55.361 "num_base_bdevs_operational": 1, 00:23:55.361 "base_bdevs_list": [ 00:23:55.361 { 00:23:55.361 "name": null, 00:23:55.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.361 "is_configured": false, 00:23:55.361 "data_offset": 2048, 00:23:55.361 "data_size": 63488 00:23:55.361 }, 00:23:55.361 { 00:23:55.361 "name": "BaseBdev2", 00:23:55.361 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:55.361 "is_configured": true, 00:23:55.361 "data_offset": 2048, 00:23:55.361 "data_size": 63488 00:23:55.361 } 00:23:55.361 ] 00:23:55.361 }' 00:23:55.361 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.361 13:42:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:55.927 13:42:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.927 "name": "raid_bdev1", 00:23:55.927 "uuid": "cb038349-3f7a-4de9-a2d7-70c377aa564d", 00:23:55.927 "strip_size_kb": 0, 00:23:55.927 "state": "online", 00:23:55.927 "raid_level": "raid1", 00:23:55.927 "superblock": true, 00:23:55.927 "num_base_bdevs": 2, 00:23:55.927 "num_base_bdevs_discovered": 1, 00:23:55.927 "num_base_bdevs_operational": 1, 00:23:55.927 "base_bdevs_list": [ 00:23:55.927 { 00:23:55.927 "name": null, 00:23:55.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.927 "is_configured": false, 00:23:55.927 "data_offset": 2048, 00:23:55.927 "data_size": 63488 00:23:55.927 }, 00:23:55.927 { 00:23:55.927 "name": "BaseBdev2", 00:23:55.927 "uuid": "07f5bbd3-c965-57e6-8c8f-069381fdd7ab", 00:23:55.927 "is_configured": true, 00:23:55.927 "data_offset": 2048, 00:23:55.927 "data_size": 63488 00:23:55.927 } 00:23:55.927 ] 00:23:55.927 }' 00:23:55.927 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.185 13:42:35 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2184860 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2184860 ']' 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2184860 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2184860 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2184860' 00:23:56.185 killing process with pid 2184860 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2184860 00:23:56.185 Received shutdown signal, test time was about 27.012839 seconds 00:23:56.185 00:23:56.185 Latency(us) 00:23:56.185 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.185 =================================================================================================================== 00:23:56.185 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:56.185 [2024-07-15 13:42:35.477746] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:56.185 [2024-07-15 13:42:35.477841] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.185 [2024-07-15 13:42:35.477885] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:23:56.185 [2024-07-15 13:42:35.477897] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bdcf70 name raid_bdev1, state offline 00:23:56.185 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2184860 00:23:56.185 [2024-07-15 13:42:35.499053] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:56.443 00:23:56.443 real 0m31.637s 00:23:56.443 user 0m49.478s 00:23:56.443 sys 0m4.483s 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:56.443 ************************************ 00:23:56.443 END TEST raid_rebuild_test_sb_io 00:23:56.443 ************************************ 00:23:56.443 13:42:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:56.443 13:42:35 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:56.443 13:42:35 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:23:56.443 13:42:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:56.443 13:42:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:56.443 13:42:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:56.443 ************************************ 00:23:56.443 START TEST raid_rebuild_test 00:23:56.443 ************************************ 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:56.443 13:42:35 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2189730 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2189730 /var/tmp/spdk-raid.sock 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2189730 ']' 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:56.443 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:56.443 13:42:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:56.443 [2024-07-15 13:42:35.853725] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:23:56.443 [2024-07-15 13:42:35.853787] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2189730 ] 00:23:56.443 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:56.443 Zero copy mechanism will not be used. 00:23:56.729 [2024-07-15 13:42:35.971355] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:56.729 [2024-07-15 13:42:36.077274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:56.996 [2024-07-15 13:42:36.148353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:56.996 [2024-07-15 13:42:36.148390] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:57.561 13:42:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:57.561 13:42:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:57.561 13:42:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:57.561 13:42:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:57.818 BaseBdev1_malloc 00:23:57.818 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:57.818 [2024-07-15 
13:42:37.233635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:57.818 [2024-07-15 13:42:37.233683] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:57.818 [2024-07-15 13:42:37.233707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111ed40 00:23:57.818 [2024-07-15 13:42:37.233720] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:57.818 [2024-07-15 13:42:37.235539] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:57.818 [2024-07-15 13:42:37.235574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:57.818 BaseBdev1 00:23:58.076 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:58.076 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:58.076 BaseBdev2_malloc 00:23:58.076 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:58.333 [2024-07-15 13:42:37.719826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:58.333 [2024-07-15 13:42:37.719871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:58.333 [2024-07-15 13:42:37.719896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111f860 00:23:58.333 [2024-07-15 13:42:37.719909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:58.333 [2024-07-15 13:42:37.721360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:58.333 [2024-07-15 13:42:37.721388] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:58.333 BaseBdev2 00:23:58.333 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:58.333 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:58.590 BaseBdev3_malloc 00:23:58.590 13:42:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:58.848 [2024-07-15 13:42:38.145579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:58.848 [2024-07-15 13:42:38.145631] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:58.848 [2024-07-15 13:42:38.145654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cc8f0 00:23:58.848 [2024-07-15 13:42:38.145667] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:58.848 [2024-07-15 13:42:38.147272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:58.848 [2024-07-15 13:42:38.147301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:58.848 BaseBdev3 00:23:58.848 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:58.848 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:59.106 BaseBdev4_malloc 00:23:59.106 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:23:59.364 [2024-07-15 13:42:38.559226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:59.365 [2024-07-15 13:42:38.559275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.365 [2024-07-15 13:42:38.559299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cbad0 00:23:59.365 [2024-07-15 13:42:38.559312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.365 [2024-07-15 13:42:38.560885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.365 [2024-07-15 13:42:38.560913] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:59.365 BaseBdev4 00:23:59.365 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:59.622 spare_malloc 00:23:59.622 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:59.622 spare_delay 00:23:59.622 13:42:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:59.880 [2024-07-15 13:42:39.190478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:59.880 [2024-07-15 13:42:39.190528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.880 [2024-07-15 13:42:39.190549] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12d05b0 00:23:59.880 [2024-07-15 13:42:39.190562] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.880 
[2024-07-15 13:42:39.192167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.880 [2024-07-15 13:42:39.192210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:59.880 spare 00:23:59.880 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:00.137 [2024-07-15 13:42:39.435166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:00.137 [2024-07-15 13:42:39.436480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:00.137 [2024-07-15 13:42:39.436536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:00.137 [2024-07-15 13:42:39.436581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:00.137 [2024-07-15 13:42:39.436667] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x124f8a0 00:24:00.137 [2024-07-15 13:42:39.436677] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:00.137 [2024-07-15 13:42:39.436895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c9e10 00:24:00.137 [2024-07-15 13:42:39.437057] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x124f8a0 00:24:00.137 [2024-07-15 13:42:39.437068] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x124f8a0 00:24:00.137 [2024-07-15 13:42:39.437189] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.137 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.138 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.138 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.138 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.395 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.395 "name": "raid_bdev1", 00:24:00.395 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:00.395 "strip_size_kb": 0, 00:24:00.395 "state": "online", 00:24:00.395 "raid_level": "raid1", 00:24:00.395 "superblock": false, 00:24:00.395 "num_base_bdevs": 4, 00:24:00.395 "num_base_bdevs_discovered": 4, 00:24:00.395 "num_base_bdevs_operational": 4, 00:24:00.395 "base_bdevs_list": [ 00:24:00.395 { 00:24:00.395 "name": "BaseBdev1", 00:24:00.395 "uuid": "aef40834-e643-566c-90c2-b8bf8de9c030", 00:24:00.395 "is_configured": true, 00:24:00.395 "data_offset": 0, 00:24:00.395 "data_size": 65536 00:24:00.395 }, 00:24:00.395 { 00:24:00.395 "name": "BaseBdev2", 00:24:00.395 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 
00:24:00.395 "is_configured": true, 00:24:00.395 "data_offset": 0, 00:24:00.395 "data_size": 65536 00:24:00.395 }, 00:24:00.395 { 00:24:00.395 "name": "BaseBdev3", 00:24:00.395 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:00.395 "is_configured": true, 00:24:00.395 "data_offset": 0, 00:24:00.395 "data_size": 65536 00:24:00.395 }, 00:24:00.395 { 00:24:00.395 "name": "BaseBdev4", 00:24:00.395 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:00.395 "is_configured": true, 00:24:00.395 "data_offset": 0, 00:24:00.395 "data_size": 65536 00:24:00.395 } 00:24:00.395 ] 00:24:00.395 }' 00:24:00.395 13:42:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.395 13:42:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:00.960 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:00.960 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:01.218 [2024-07-15 13:42:40.506303] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:01.218 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:01.218 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.218 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # 
local write_unit_size 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:01.475 13:42:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:01.733 [2024-07-15 13:42:40.999345] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c9e10 00:24:01.733 /dev/nbd0 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q 
-w nbd0 /proc/partitions 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:01.733 1+0 records in 00:24:01.733 1+0 records out 00:24:01.733 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224219 s, 18.3 MB/s 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:01.733 13:42:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:09.834 65536+0 records in 00:24:09.834 65536+0 records out 00:24:09.834 33554432 bytes (34 MB, 32 MiB) copied, 7.0293 s, 4.8 MB/s 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:09.834 [2024-07-15 13:42:48.363305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:09.834 [2024-07-15 13:42:48.599657] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:09.834 
13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.834 "name": "raid_bdev1", 00:24:09.834 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:09.834 "strip_size_kb": 0, 00:24:09.834 "state": "online", 00:24:09.834 "raid_level": "raid1", 00:24:09.834 "superblock": false, 00:24:09.834 "num_base_bdevs": 4, 00:24:09.834 "num_base_bdevs_discovered": 3, 00:24:09.834 "num_base_bdevs_operational": 3, 00:24:09.834 "base_bdevs_list": [ 00:24:09.834 { 00:24:09.834 "name": null, 00:24:09.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.834 "is_configured": 
false, 00:24:09.834 "data_offset": 0, 00:24:09.834 "data_size": 65536 00:24:09.834 }, 00:24:09.834 { 00:24:09.834 "name": "BaseBdev2", 00:24:09.834 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 00:24:09.834 "is_configured": true, 00:24:09.834 "data_offset": 0, 00:24:09.834 "data_size": 65536 00:24:09.834 }, 00:24:09.834 { 00:24:09.834 "name": "BaseBdev3", 00:24:09.834 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:09.834 "is_configured": true, 00:24:09.834 "data_offset": 0, 00:24:09.834 "data_size": 65536 00:24:09.834 }, 00:24:09.834 { 00:24:09.834 "name": "BaseBdev4", 00:24:09.834 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:09.834 "is_configured": true, 00:24:09.834 "data_offset": 0, 00:24:09.834 "data_size": 65536 00:24:09.834 } 00:24:09.834 ] 00:24:09.834 }' 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.834 13:42:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:10.091 13:42:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:10.350 [2024-07-15 13:42:49.686546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:10.350 [2024-07-15 13:42:49.690641] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12556b0 00:24:10.350 [2024-07-15 13:42:49.693048] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:10.350 13:42:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.720 "name": "raid_bdev1", 00:24:11.720 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:11.720 "strip_size_kb": 0, 00:24:11.720 "state": "online", 00:24:11.720 "raid_level": "raid1", 00:24:11.720 "superblock": false, 00:24:11.720 "num_base_bdevs": 4, 00:24:11.720 "num_base_bdevs_discovered": 4, 00:24:11.720 "num_base_bdevs_operational": 4, 00:24:11.720 "process": { 00:24:11.720 "type": "rebuild", 00:24:11.720 "target": "spare", 00:24:11.720 "progress": { 00:24:11.720 "blocks": 24576, 00:24:11.720 "percent": 37 00:24:11.720 } 00:24:11.720 }, 00:24:11.720 "base_bdevs_list": [ 00:24:11.720 { 00:24:11.720 "name": "spare", 00:24:11.720 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:11.720 "is_configured": true, 00:24:11.720 "data_offset": 0, 00:24:11.720 "data_size": 65536 00:24:11.720 }, 00:24:11.720 { 00:24:11.720 "name": "BaseBdev2", 00:24:11.720 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 00:24:11.720 "is_configured": true, 00:24:11.720 "data_offset": 0, 00:24:11.720 "data_size": 65536 00:24:11.720 }, 00:24:11.720 { 00:24:11.720 "name": "BaseBdev3", 00:24:11.720 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:11.720 "is_configured": true, 00:24:11.720 "data_offset": 0, 00:24:11.720 "data_size": 65536 00:24:11.720 }, 00:24:11.720 { 00:24:11.720 "name": "BaseBdev4", 00:24:11.720 "uuid": 
"ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:11.720 "is_configured": true, 00:24:11.720 "data_offset": 0, 00:24:11.720 "data_size": 65536 00:24:11.720 } 00:24:11.720 ] 00:24:11.720 }' 00:24:11.720 13:42:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.720 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.720 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.720 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.720 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:11.978 [2024-07-15 13:42:51.280395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.978 [2024-07-15 13:42:51.305456] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:11.978 [2024-07-15 13:42:51.305501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.978 [2024-07-15 13:42:51.305520] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.978 [2024-07-15 13:42:51.305528] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.978 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.237 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.237 "name": "raid_bdev1", 00:24:12.237 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:12.237 "strip_size_kb": 0, 00:24:12.237 "state": "online", 00:24:12.237 "raid_level": "raid1", 00:24:12.237 "superblock": false, 00:24:12.237 "num_base_bdevs": 4, 00:24:12.237 "num_base_bdevs_discovered": 3, 00:24:12.237 "num_base_bdevs_operational": 3, 00:24:12.237 "base_bdevs_list": [ 00:24:12.237 { 00:24:12.237 "name": null, 00:24:12.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.237 "is_configured": false, 00:24:12.237 "data_offset": 0, 00:24:12.237 "data_size": 65536 00:24:12.237 }, 00:24:12.237 { 00:24:12.237 "name": "BaseBdev2", 00:24:12.237 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 00:24:12.237 "is_configured": true, 00:24:12.237 "data_offset": 0, 00:24:12.237 "data_size": 65536 00:24:12.237 }, 00:24:12.237 { 00:24:12.237 "name": "BaseBdev3", 00:24:12.237 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:12.237 "is_configured": true, 00:24:12.237 "data_offset": 0, 00:24:12.237 "data_size": 65536 
00:24:12.237 }, 00:24:12.237 { 00:24:12.237 "name": "BaseBdev4", 00:24:12.237 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:12.237 "is_configured": true, 00:24:12.237 "data_offset": 0, 00:24:12.237 "data_size": 65536 00:24:12.237 } 00:24:12.237 ] 00:24:12.237 }' 00:24:12.237 13:42:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.237 13:42:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.800 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.057 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.057 "name": "raid_bdev1", 00:24:13.057 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:13.057 "strip_size_kb": 0, 00:24:13.057 "state": "online", 00:24:13.058 "raid_level": "raid1", 00:24:13.058 "superblock": false, 00:24:13.058 "num_base_bdevs": 4, 00:24:13.058 "num_base_bdevs_discovered": 3, 00:24:13.058 "num_base_bdevs_operational": 3, 00:24:13.058 "base_bdevs_list": [ 00:24:13.058 { 00:24:13.058 "name": null, 00:24:13.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.058 "is_configured": false, 00:24:13.058 "data_offset": 0, 00:24:13.058 
"data_size": 65536 00:24:13.058 }, 00:24:13.058 { 00:24:13.058 "name": "BaseBdev2", 00:24:13.058 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 00:24:13.058 "is_configured": true, 00:24:13.058 "data_offset": 0, 00:24:13.058 "data_size": 65536 00:24:13.058 }, 00:24:13.058 { 00:24:13.058 "name": "BaseBdev3", 00:24:13.058 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:13.058 "is_configured": true, 00:24:13.058 "data_offset": 0, 00:24:13.058 "data_size": 65536 00:24:13.058 }, 00:24:13.058 { 00:24:13.058 "name": "BaseBdev4", 00:24:13.058 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:13.058 "is_configured": true, 00:24:13.058 "data_offset": 0, 00:24:13.058 "data_size": 65536 00:24:13.058 } 00:24:13.058 ] 00:24:13.058 }' 00:24:13.058 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.058 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:13.058 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.058 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:13.058 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:13.316 [2024-07-15 13:42:52.569447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:13.316 [2024-07-15 13:42:52.573578] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12556b0 00:24:13.316 [2024-07-15 13:42:52.575092] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:13.316 13:42:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:14.249 
13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.249 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.507 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:14.507 "name": "raid_bdev1", 00:24:14.507 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:14.507 "strip_size_kb": 0, 00:24:14.507 "state": "online", 00:24:14.507 "raid_level": "raid1", 00:24:14.507 "superblock": false, 00:24:14.507 "num_base_bdevs": 4, 00:24:14.507 "num_base_bdevs_discovered": 4, 00:24:14.508 "num_base_bdevs_operational": 4, 00:24:14.508 "process": { 00:24:14.508 "type": "rebuild", 00:24:14.508 "target": "spare", 00:24:14.508 "progress": { 00:24:14.508 "blocks": 24576, 00:24:14.508 "percent": 37 00:24:14.508 } 00:24:14.508 }, 00:24:14.508 "base_bdevs_list": [ 00:24:14.508 { 00:24:14.508 "name": "spare", 00:24:14.508 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:14.508 "is_configured": true, 00:24:14.508 "data_offset": 0, 00:24:14.508 "data_size": 65536 00:24:14.508 }, 00:24:14.508 { 00:24:14.508 "name": "BaseBdev2", 00:24:14.508 "uuid": "5e952c78-1956-561a-96c6-cbee642e2176", 00:24:14.508 "is_configured": true, 00:24:14.508 "data_offset": 0, 00:24:14.508 "data_size": 65536 00:24:14.508 }, 00:24:14.508 { 00:24:14.508 "name": "BaseBdev3", 00:24:14.508 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:14.508 
"is_configured": true, 00:24:14.508 "data_offset": 0, 00:24:14.508 "data_size": 65536 00:24:14.508 }, 00:24:14.508 { 00:24:14.508 "name": "BaseBdev4", 00:24:14.508 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:14.508 "is_configured": true, 00:24:14.508 "data_offset": 0, 00:24:14.508 "data_size": 65536 00:24:14.508 } 00:24:14.508 ] 00:24:14.508 }' 00:24:14.508 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:14.508 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:14.508 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:14.766 13:42:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:14.766 [2024-07-15 13:42:54.170818] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:14.766 [2024-07-15 13:42:54.187790] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x12556b0 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.024 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.282 "name": "raid_bdev1", 00:24:15.282 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:15.282 "strip_size_kb": 0, 00:24:15.282 "state": "online", 00:24:15.282 "raid_level": "raid1", 00:24:15.282 "superblock": false, 00:24:15.282 "num_base_bdevs": 4, 00:24:15.282 "num_base_bdevs_discovered": 3, 00:24:15.282 "num_base_bdevs_operational": 3, 00:24:15.282 "process": { 00:24:15.282 "type": "rebuild", 00:24:15.282 "target": "spare", 00:24:15.282 "progress": { 00:24:15.282 "blocks": 36864, 00:24:15.282 "percent": 56 00:24:15.282 } 00:24:15.282 }, 00:24:15.282 "base_bdevs_list": [ 00:24:15.282 { 00:24:15.282 "name": "spare", 00:24:15.282 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:15.282 "is_configured": true, 00:24:15.282 "data_offset": 0, 00:24:15.282 "data_size": 65536 00:24:15.282 }, 00:24:15.282 { 00:24:15.282 "name": null, 00:24:15.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.282 "is_configured": false, 00:24:15.282 "data_offset": 0, 00:24:15.282 "data_size": 65536 00:24:15.282 }, 00:24:15.282 { 00:24:15.282 "name": "BaseBdev3", 00:24:15.282 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:15.282 
"is_configured": true, 00:24:15.282 "data_offset": 0, 00:24:15.282 "data_size": 65536 00:24:15.282 }, 00:24:15.282 { 00:24:15.282 "name": "BaseBdev4", 00:24:15.282 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:15.282 "is_configured": true, 00:24:15.282 "data_offset": 0, 00:24:15.282 "data_size": 65536 00:24:15.282 } 00:24:15.282 ] 00:24:15.282 }' 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=878 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.282 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.540 "name": 
"raid_bdev1", 00:24:15.540 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:15.540 "strip_size_kb": 0, 00:24:15.540 "state": "online", 00:24:15.540 "raid_level": "raid1", 00:24:15.540 "superblock": false, 00:24:15.540 "num_base_bdevs": 4, 00:24:15.540 "num_base_bdevs_discovered": 3, 00:24:15.540 "num_base_bdevs_operational": 3, 00:24:15.540 "process": { 00:24:15.540 "type": "rebuild", 00:24:15.540 "target": "spare", 00:24:15.540 "progress": { 00:24:15.540 "blocks": 43008, 00:24:15.540 "percent": 65 00:24:15.540 } 00:24:15.540 }, 00:24:15.540 "base_bdevs_list": [ 00:24:15.540 { 00:24:15.540 "name": "spare", 00:24:15.540 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:15.540 "is_configured": true, 00:24:15.540 "data_offset": 0, 00:24:15.540 "data_size": 65536 00:24:15.540 }, 00:24:15.540 { 00:24:15.540 "name": null, 00:24:15.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.540 "is_configured": false, 00:24:15.540 "data_offset": 0, 00:24:15.540 "data_size": 65536 00:24:15.540 }, 00:24:15.540 { 00:24:15.540 "name": "BaseBdev3", 00:24:15.540 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:15.540 "is_configured": true, 00:24:15.540 "data_offset": 0, 00:24:15.540 "data_size": 65536 00:24:15.540 }, 00:24:15.540 { 00:24:15.540 "name": "BaseBdev4", 00:24:15.540 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:15.540 "is_configured": true, 00:24:15.540 "data_offset": 0, 00:24:15.540 "data_size": 65536 00:24:15.540 } 00:24:15.540 ] 00:24:15.540 }' 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.540 13:42:54 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:24:16.473 [2024-07-15 13:42:55.800349] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:16.473 [2024-07-15 13:42:55.800416] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:16.473 [2024-07-15 13:42:55.800456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.473 13:42:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.731 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.731 "name": "raid_bdev1", 00:24:16.731 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:16.731 "strip_size_kb": 0, 00:24:16.731 "state": "online", 00:24:16.731 "raid_level": "raid1", 00:24:16.731 "superblock": false, 00:24:16.731 "num_base_bdevs": 4, 00:24:16.731 "num_base_bdevs_discovered": 3, 00:24:16.731 "num_base_bdevs_operational": 3, 00:24:16.731 "base_bdevs_list": [ 00:24:16.731 { 00:24:16.731 "name": "spare", 00:24:16.731 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:16.731 
"is_configured": true, 00:24:16.731 "data_offset": 0, 00:24:16.731 "data_size": 65536 00:24:16.731 }, 00:24:16.731 { 00:24:16.731 "name": null, 00:24:16.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.731 "is_configured": false, 00:24:16.731 "data_offset": 0, 00:24:16.731 "data_size": 65536 00:24:16.731 }, 00:24:16.731 { 00:24:16.731 "name": "BaseBdev3", 00:24:16.731 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:16.731 "is_configured": true, 00:24:16.731 "data_offset": 0, 00:24:16.731 "data_size": 65536 00:24:16.731 }, 00:24:16.731 { 00:24:16.731 "name": "BaseBdev4", 00:24:16.731 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:16.731 "is_configured": true, 00:24:16.731 "data_offset": 0, 00:24:16.731 "data_size": 65536 00:24:16.731 } 00:24:16.731 ] 00:24:16.731 }' 00:24:16.731 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.988 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:17.246 "name": "raid_bdev1", 00:24:17.246 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:17.246 "strip_size_kb": 0, 00:24:17.246 "state": "online", 00:24:17.246 "raid_level": "raid1", 00:24:17.246 "superblock": false, 00:24:17.246 "num_base_bdevs": 4, 00:24:17.246 "num_base_bdevs_discovered": 3, 00:24:17.246 "num_base_bdevs_operational": 3, 00:24:17.246 "base_bdevs_list": [ 00:24:17.246 { 00:24:17.246 "name": "spare", 00:24:17.246 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:17.246 "is_configured": true, 00:24:17.246 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 }, 00:24:17.246 { 00:24:17.246 "name": null, 00:24:17.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.246 "is_configured": false, 00:24:17.246 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 }, 00:24:17.246 { 00:24:17.246 "name": "BaseBdev3", 00:24:17.246 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:17.246 "is_configured": true, 00:24:17.246 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 }, 00:24:17.246 { 00:24:17.246 "name": "BaseBdev4", 00:24:17.246 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:17.246 "is_configured": true, 00:24:17.246 "data_offset": 0, 00:24:17.246 "data_size": 65536 00:24:17.246 } 00:24:17.246 ] 00:24:17.246 }' 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.246 13:42:56 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.246 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.505 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.505 "name": "raid_bdev1", 00:24:17.505 "uuid": "db1b6622-d46d-40b1-97ce-d18688097752", 00:24:17.505 "strip_size_kb": 0, 00:24:17.505 "state": "online", 00:24:17.505 "raid_level": "raid1", 00:24:17.505 "superblock": false, 00:24:17.505 "num_base_bdevs": 4, 00:24:17.505 "num_base_bdevs_discovered": 3, 00:24:17.505 "num_base_bdevs_operational": 3, 00:24:17.505 "base_bdevs_list": [ 00:24:17.505 { 00:24:17.505 "name": 
"spare", 00:24:17.505 "uuid": "596a5a62-5427-50e4-8158-861403c78eb0", 00:24:17.505 "is_configured": true, 00:24:17.505 "data_offset": 0, 00:24:17.505 "data_size": 65536 00:24:17.505 }, 00:24:17.505 { 00:24:17.505 "name": null, 00:24:17.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.505 "is_configured": false, 00:24:17.505 "data_offset": 0, 00:24:17.505 "data_size": 65536 00:24:17.505 }, 00:24:17.505 { 00:24:17.505 "name": "BaseBdev3", 00:24:17.505 "uuid": "2aab4442-7238-5fbb-a9c0-cd06cf404ef2", 00:24:17.505 "is_configured": true, 00:24:17.505 "data_offset": 0, 00:24:17.505 "data_size": 65536 00:24:17.505 }, 00:24:17.505 { 00:24:17.505 "name": "BaseBdev4", 00:24:17.505 "uuid": "ae9b5ae1-7e29-5714-b60f-870728ba272b", 00:24:17.505 "is_configured": true, 00:24:17.505 "data_offset": 0, 00:24:17.505 "data_size": 65536 00:24:17.505 } 00:24:17.505 ] 00:24:17.505 }' 00:24:17.505 13:42:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.505 13:42:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:18.093 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:18.351 [2024-07-15 13:42:57.578132] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:18.351 [2024-07-15 13:42:57.578161] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:18.351 [2024-07-15 13:42:57.578227] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:18.351 [2024-07-15 13:42:57.578298] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:18.351 [2024-07-15 13:42:57.578310] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x124f8a0 name raid_bdev1, state offline 00:24:18.351 13:42:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.351 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:18.610 13:42:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:18.867 /dev/nbd0 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:18.867 1+0 records in 00:24:18.867 1+0 records out 00:24:18.867 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026882 s, 15.2 MB/s 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:18.867 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:19.155 /dev/nbd1 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.155 1+0 records in 00:24:19.155 1+0 records out 00:24:19.155 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314605 s, 13.0 MB/s 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.155 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.413 13:42:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2189730 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2189730 ']' 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2189730 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:19.670 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2189730 00:24:19.928 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:24:19.928 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:19.928 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2189730' 00:24:19.928 killing process with pid 2189730 00:24:19.928 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2189730 00:24:19.928 Received shutdown signal, test time was about 60.000000 seconds 00:24:19.928 00:24:19.928 Latency(us) 00:24:19.928 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:19.928 =================================================================================================================== 00:24:19.928 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:19.928 [2024-07-15 13:42:59.127136] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:19.928 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2189730 00:24:19.928 [2024-07-15 13:42:59.177924] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:24:20.185 00:24:20.185 real 0m23.615s 00:24:20.185 user 0m31.982s 00:24:20.185 sys 0m5.030s 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:20.185 ************************************ 00:24:20.185 END TEST raid_rebuild_test 00:24:20.185 ************************************ 00:24:20.185 13:42:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:20.185 13:42:59 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:20.185 13:42:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:20.185 13:42:59 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:24:20.185 13:42:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:20.185 ************************************ 00:24:20.185 START TEST raid_rebuild_test_sb 00:24:20.185 ************************************ 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2193016 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2193016 /var/tmp/spdk-raid.sock 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2193016 ']' 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:20.185 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:20.185 13:42:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:20.185 [2024-07-15 13:42:59.566917] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:24:20.185 [2024-07-15 13:42:59.566993] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2193016 ] 00:24:20.185 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:20.185 Zero copy mechanism will not be used. 
00:24:20.442 [2024-07-15 13:42:59.698381] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:20.442 [2024-07-15 13:42:59.803556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:20.700 [2024-07-15 13:42:59.869447] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:20.700 [2024-07-15 13:42:59.869484] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:21.266 13:43:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:21.266 13:43:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:21.266 13:43:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:21.266 13:43:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:21.266 BaseBdev1_malloc 00:24:21.266 13:43:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:21.523 [2024-07-15 13:43:00.850171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:21.523 [2024-07-15 13:43:00.850227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.523 [2024-07-15 13:43:00.850253] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1967d40 00:24:21.523 [2024-07-15 13:43:00.850266] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.523 [2024-07-15 13:43:00.851948] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.523 [2024-07-15 13:43:00.851977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:21.523 BaseBdev1 
00:24:21.523 13:43:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:21.523 13:43:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:21.780 BaseBdev2_malloc 00:24:21.780 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:22.038 [2024-07-15 13:43:01.352454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:22.038 [2024-07-15 13:43:01.352500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.038 [2024-07-15 13:43:01.352525] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1968860 00:24:22.038 [2024-07-15 13:43:01.352543] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.038 [2024-07-15 13:43:01.353919] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.038 [2024-07-15 13:43:01.353954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:22.038 BaseBdev2 00:24:22.038 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:22.038 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:22.297 BaseBdev3_malloc 00:24:22.297 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:22.555 [2024-07-15 13:43:01.854454] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:22.555 [2024-07-15 13:43:01.854503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.555 [2024-07-15 13:43:01.854523] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b158f0 00:24:22.555 [2024-07-15 13:43:01.854536] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.555 [2024-07-15 13:43:01.855912] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.555 [2024-07-15 13:43:01.855950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:22.555 BaseBdev3 00:24:22.555 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:22.555 13:43:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:22.813 BaseBdev4_malloc 00:24:22.813 13:43:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:23.072 [2024-07-15 13:43:02.280194] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:23.072 [2024-07-15 13:43:02.280240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.072 [2024-07-15 13:43:02.280263] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b14ad0 00:24:23.072 [2024-07-15 13:43:02.280276] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.072 [2024-07-15 13:43:02.281723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.072 [2024-07-15 13:43:02.281751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:24:23.072 BaseBdev4 00:24:23.072 13:43:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:23.330 spare_malloc 00:24:23.330 13:43:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:23.587 spare_delay 00:24:23.587 13:43:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:23.844 [2024-07-15 13:43:03.034697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:23.844 [2024-07-15 13:43:03.034743] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:23.844 [2024-07-15 13:43:03.034763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b195b0 00:24:23.844 [2024-07-15 13:43:03.034775] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:23.844 [2024-07-15 13:43:03.036200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:23.844 [2024-07-15 13:43:03.036227] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:23.844 spare 00:24:23.844 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:24.101 [2024-07-15 13:43:03.283380] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:24.101 [2024-07-15 13:43:03.284536] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:24.101 [2024-07-15 13:43:03.284588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:24.101 [2024-07-15 13:43:03.284633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:24.101 [2024-07-15 13:43:03.284829] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a988a0 00:24:24.101 [2024-07-15 13:43:03.284841] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:24.101 [2024-07-15 13:43:03.285032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b12e10 00:24:24.101 [2024-07-15 13:43:03.285176] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a988a0 00:24:24.101 [2024-07-15 13:43:03.285187] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a988a0 00:24:24.101 [2024-07-15 13:43:03.285276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.101 
13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.101 "name": "raid_bdev1", 00:24:24.101 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:24.101 "strip_size_kb": 0, 00:24:24.101 "state": "online", 00:24:24.101 "raid_level": "raid1", 00:24:24.101 "superblock": true, 00:24:24.101 "num_base_bdevs": 4, 00:24:24.101 "num_base_bdevs_discovered": 4, 00:24:24.101 "num_base_bdevs_operational": 4, 00:24:24.101 "base_bdevs_list": [ 00:24:24.101 { 00:24:24.101 "name": "BaseBdev1", 00:24:24.101 "uuid": "b4336c0a-1034-5a34-b924-13c75798d29b", 00:24:24.101 "is_configured": true, 00:24:24.101 "data_offset": 2048, 00:24:24.101 "data_size": 63488 00:24:24.101 }, 00:24:24.101 { 00:24:24.101 "name": "BaseBdev2", 00:24:24.101 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:24.101 "is_configured": true, 00:24:24.101 "data_offset": 2048, 00:24:24.101 "data_size": 63488 00:24:24.101 }, 00:24:24.101 { 00:24:24.101 "name": "BaseBdev3", 00:24:24.101 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:24.101 "is_configured": true, 00:24:24.101 "data_offset": 2048, 00:24:24.101 "data_size": 63488 00:24:24.101 }, 00:24:24.101 { 00:24:24.101 "name": "BaseBdev4", 00:24:24.101 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:24.101 "is_configured": true, 00:24:24.101 "data_offset": 2048, 00:24:24.101 "data_size": 63488 00:24:24.101 } 00:24:24.101 ] 00:24:24.101 }' 00:24:24.101 13:43:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.101 13:43:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:24.662 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:24.662 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:24.918 [2024-07-15 13:43:04.290348] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:24.918 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:24.918 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.918 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:25.174 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:25.431 [2024-07-15 13:43:04.787401] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b12e10 00:24:25.431 /dev/nbd0 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:24:25.431 1+0 records in 00:24:25.431 1+0 records out 00:24:25.431 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272229 s, 15.0 MB/s 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:25.431 13:43:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:33.564 63488+0 records in 00:24:33.564 63488+0 records out 00:24:33.564 32505856 bytes (33 MB, 31 MiB) copied, 7.26353 s, 4.5 MB/s 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:33.564 [2024-07-15 13:43:12.393300] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:33.564 [2024-07-15 13:43:12.553783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.564 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.565 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.565 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.565 "name": "raid_bdev1", 00:24:33.565 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:33.565 "strip_size_kb": 0, 00:24:33.565 "state": "online", 00:24:33.565 "raid_level": "raid1", 00:24:33.565 "superblock": true, 00:24:33.565 "num_base_bdevs": 4, 00:24:33.565 "num_base_bdevs_discovered": 3, 00:24:33.565 "num_base_bdevs_operational": 3, 00:24:33.565 "base_bdevs_list": [ 00:24:33.565 { 00:24:33.565 "name": null, 00:24:33.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.565 "is_configured": false, 00:24:33.565 "data_offset": 2048, 00:24:33.565 "data_size": 63488 00:24:33.565 }, 00:24:33.565 { 00:24:33.565 "name": "BaseBdev2", 00:24:33.565 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:33.565 "is_configured": true, 00:24:33.565 "data_offset": 2048, 00:24:33.565 "data_size": 63488 00:24:33.565 }, 00:24:33.565 { 00:24:33.565 "name": "BaseBdev3", 
00:24:33.565 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:33.565 "is_configured": true, 00:24:33.565 "data_offset": 2048, 00:24:33.565 "data_size": 63488 00:24:33.565 }, 00:24:33.565 { 00:24:33.565 "name": "BaseBdev4", 00:24:33.565 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:33.565 "is_configured": true, 00:24:33.565 "data_offset": 2048, 00:24:33.565 "data_size": 63488 00:24:33.565 } 00:24:33.565 ] 00:24:33.565 }' 00:24:33.565 13:43:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.565 13:43:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:34.130 13:43:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:34.388 [2024-07-15 13:43:13.568488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:34.388 [2024-07-15 13:43:13.572614] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b12e10 00:24:34.388 [2024-07-15 13:43:13.574992] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:34.388 13:43:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.319 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.576 "name": "raid_bdev1", 00:24:35.576 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:35.576 "strip_size_kb": 0, 00:24:35.576 "state": "online", 00:24:35.576 "raid_level": "raid1", 00:24:35.576 "superblock": true, 00:24:35.576 "num_base_bdevs": 4, 00:24:35.576 "num_base_bdevs_discovered": 4, 00:24:35.576 "num_base_bdevs_operational": 4, 00:24:35.576 "process": { 00:24:35.576 "type": "rebuild", 00:24:35.576 "target": "spare", 00:24:35.576 "progress": { 00:24:35.576 "blocks": 24576, 00:24:35.576 "percent": 38 00:24:35.576 } 00:24:35.576 }, 00:24:35.576 "base_bdevs_list": [ 00:24:35.576 { 00:24:35.576 "name": "spare", 00:24:35.576 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:35.576 "is_configured": true, 00:24:35.576 "data_offset": 2048, 00:24:35.576 "data_size": 63488 00:24:35.576 }, 00:24:35.576 { 00:24:35.576 "name": "BaseBdev2", 00:24:35.576 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:35.576 "is_configured": true, 00:24:35.576 "data_offset": 2048, 00:24:35.576 "data_size": 63488 00:24:35.576 }, 00:24:35.576 { 00:24:35.576 "name": "BaseBdev3", 00:24:35.576 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:35.576 "is_configured": true, 00:24:35.576 "data_offset": 2048, 00:24:35.576 "data_size": 63488 00:24:35.576 }, 00:24:35.576 { 00:24:35.576 "name": "BaseBdev4", 00:24:35.576 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:35.576 "is_configured": true, 00:24:35.576 "data_offset": 2048, 00:24:35.576 "data_size": 63488 00:24:35.576 } 00:24:35.576 ] 00:24:35.576 }' 00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:35.576 13:43:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:35.834 [2024-07-15 13:43:15.162286] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.834 [2024-07-15 13:43:15.187678] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:35.834 [2024-07-15 13:43:15.187732] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.834 [2024-07-15 13:43:15.187749] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.834 [2024-07-15 13:43:15.187758] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.834 13:43:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.834 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.092 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.092 "name": "raid_bdev1", 00:24:36.092 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:36.092 "strip_size_kb": 0, 00:24:36.092 "state": "online", 00:24:36.092 "raid_level": "raid1", 00:24:36.092 "superblock": true, 00:24:36.092 "num_base_bdevs": 4, 00:24:36.092 "num_base_bdevs_discovered": 3, 00:24:36.092 "num_base_bdevs_operational": 3, 00:24:36.092 "base_bdevs_list": [ 00:24:36.092 { 00:24:36.092 "name": null, 00:24:36.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.092 "is_configured": false, 00:24:36.092 "data_offset": 2048, 00:24:36.092 "data_size": 63488 00:24:36.092 }, 00:24:36.092 { 00:24:36.092 "name": "BaseBdev2", 00:24:36.092 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:36.092 "is_configured": true, 00:24:36.092 "data_offset": 2048, 00:24:36.092 "data_size": 63488 00:24:36.092 }, 00:24:36.092 { 00:24:36.092 "name": "BaseBdev3", 00:24:36.092 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:36.092 "is_configured": true, 00:24:36.092 "data_offset": 2048, 00:24:36.092 "data_size": 63488 00:24:36.092 }, 00:24:36.092 { 00:24:36.092 "name": "BaseBdev4", 00:24:36.092 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:36.092 "is_configured": true, 00:24:36.092 "data_offset": 2048, 00:24:36.092 "data_size": 63488 
00:24:36.092 } 00:24:36.092 ] 00:24:36.092 }' 00:24:36.092 13:43:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.092 13:43:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.657 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.913 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:36.913 "name": "raid_bdev1", 00:24:36.914 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:36.914 "strip_size_kb": 0, 00:24:36.914 "state": "online", 00:24:36.914 "raid_level": "raid1", 00:24:36.914 "superblock": true, 00:24:36.914 "num_base_bdevs": 4, 00:24:36.914 "num_base_bdevs_discovered": 3, 00:24:36.914 "num_base_bdevs_operational": 3, 00:24:36.914 "base_bdevs_list": [ 00:24:36.914 { 00:24:36.914 "name": null, 00:24:36.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.914 "is_configured": false, 00:24:36.914 "data_offset": 2048, 00:24:36.914 "data_size": 63488 00:24:36.914 }, 00:24:36.914 { 00:24:36.914 "name": "BaseBdev2", 00:24:36.914 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:36.914 "is_configured": true, 00:24:36.914 
"data_offset": 2048, 00:24:36.914 "data_size": 63488 00:24:36.914 }, 00:24:36.914 { 00:24:36.914 "name": "BaseBdev3", 00:24:36.914 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:36.914 "is_configured": true, 00:24:36.914 "data_offset": 2048, 00:24:36.914 "data_size": 63488 00:24:36.914 }, 00:24:36.914 { 00:24:36.914 "name": "BaseBdev4", 00:24:36.914 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:36.914 "is_configured": true, 00:24:36.914 "data_offset": 2048, 00:24:36.914 "data_size": 63488 00:24:36.914 } 00:24:36.914 ] 00:24:36.914 }' 00:24:36.914 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:37.171 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:37.171 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.171 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:37.171 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:37.429 [2024-07-15 13:43:16.623538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.429 [2024-07-15 13:43:16.628231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a98e90 00:24:37.429 [2024-07-15 13:43:16.629787] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:37.429 13:43:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.363 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.621 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.621 "name": "raid_bdev1", 00:24:38.621 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:38.621 "strip_size_kb": 0, 00:24:38.621 "state": "online", 00:24:38.621 "raid_level": "raid1", 00:24:38.621 "superblock": true, 00:24:38.621 "num_base_bdevs": 4, 00:24:38.621 "num_base_bdevs_discovered": 4, 00:24:38.621 "num_base_bdevs_operational": 4, 00:24:38.621 "process": { 00:24:38.621 "type": "rebuild", 00:24:38.621 "target": "spare", 00:24:38.621 "progress": { 00:24:38.621 "blocks": 24576, 00:24:38.621 "percent": 38 00:24:38.621 } 00:24:38.621 }, 00:24:38.621 "base_bdevs_list": [ 00:24:38.621 { 00:24:38.621 "name": "spare", 00:24:38.622 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:38.622 "is_configured": true, 00:24:38.622 "data_offset": 2048, 00:24:38.622 "data_size": 63488 00:24:38.622 }, 00:24:38.622 { 00:24:38.622 "name": "BaseBdev2", 00:24:38.622 "uuid": "8d0804d6-46f1-5868-baea-75785e394ce4", 00:24:38.622 "is_configured": true, 00:24:38.622 "data_offset": 2048, 00:24:38.622 "data_size": 63488 00:24:38.622 }, 00:24:38.622 { 00:24:38.622 "name": "BaseBdev3", 00:24:38.622 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:38.622 "is_configured": true, 00:24:38.622 "data_offset": 2048, 00:24:38.622 "data_size": 63488 00:24:38.622 }, 00:24:38.622 { 00:24:38.622 "name": 
"BaseBdev4", 00:24:38.622 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:38.622 "is_configured": true, 00:24:38.622 "data_offset": 2048, 00:24:38.622 "data_size": 63488 00:24:38.622 } 00:24:38.622 ] 00:24:38.622 }' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:38.622 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:38.622 13:43:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:38.880 [2024-07-15 13:43:18.221691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:39.138 [2024-07-15 13:43:18.342546] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1a98e90 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.138 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.395 "name": "raid_bdev1", 00:24:39.395 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:39.395 "strip_size_kb": 0, 00:24:39.395 "state": "online", 00:24:39.395 "raid_level": "raid1", 00:24:39.395 "superblock": true, 00:24:39.395 "num_base_bdevs": 4, 00:24:39.395 "num_base_bdevs_discovered": 3, 00:24:39.395 "num_base_bdevs_operational": 3, 00:24:39.395 "process": { 00:24:39.395 "type": "rebuild", 00:24:39.395 "target": "spare", 00:24:39.395 "progress": { 00:24:39.395 "blocks": 36864, 00:24:39.395 "percent": 58 00:24:39.395 } 00:24:39.395 }, 00:24:39.395 "base_bdevs_list": [ 00:24:39.395 { 00:24:39.395 "name": "spare", 00:24:39.395 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:39.395 "is_configured": true, 00:24:39.395 "data_offset": 2048, 00:24:39.395 "data_size": 63488 00:24:39.395 }, 00:24:39.395 { 00:24:39.395 "name": null, 00:24:39.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.395 "is_configured": false, 00:24:39.395 "data_offset": 2048, 00:24:39.395 
"data_size": 63488 00:24:39.395 }, 00:24:39.395 { 00:24:39.395 "name": "BaseBdev3", 00:24:39.395 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:39.395 "is_configured": true, 00:24:39.395 "data_offset": 2048, 00:24:39.395 "data_size": 63488 00:24:39.395 }, 00:24:39.395 { 00:24:39.395 "name": "BaseBdev4", 00:24:39.395 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:39.395 "is_configured": true, 00:24:39.395 "data_offset": 2048, 00:24:39.395 "data_size": 63488 00:24:39.395 } 00:24:39.395 ] 00:24:39.395 }' 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=902 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.395 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.652 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.652 "name": "raid_bdev1", 00:24:39.652 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:39.652 "strip_size_kb": 0, 00:24:39.652 "state": "online", 00:24:39.652 "raid_level": "raid1", 00:24:39.652 "superblock": true, 00:24:39.652 "num_base_bdevs": 4, 00:24:39.652 "num_base_bdevs_discovered": 3, 00:24:39.652 "num_base_bdevs_operational": 3, 00:24:39.652 "process": { 00:24:39.652 "type": "rebuild", 00:24:39.652 "target": "spare", 00:24:39.652 "progress": { 00:24:39.652 "blocks": 43008, 00:24:39.652 "percent": 67 00:24:39.652 } 00:24:39.652 }, 00:24:39.652 "base_bdevs_list": [ 00:24:39.652 { 00:24:39.652 "name": "spare", 00:24:39.652 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:39.652 "is_configured": true, 00:24:39.652 "data_offset": 2048, 00:24:39.652 "data_size": 63488 00:24:39.652 }, 00:24:39.652 { 00:24:39.652 "name": null, 00:24:39.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.652 "is_configured": false, 00:24:39.652 "data_offset": 2048, 00:24:39.652 "data_size": 63488 00:24:39.652 }, 00:24:39.652 { 00:24:39.652 "name": "BaseBdev3", 00:24:39.652 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:39.652 "is_configured": true, 00:24:39.652 "data_offset": 2048, 00:24:39.652 "data_size": 63488 00:24:39.652 }, 00:24:39.652 { 00:24:39.652 "name": "BaseBdev4", 00:24:39.652 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:39.652 "is_configured": true, 00:24:39.652 "data_offset": 2048, 00:24:39.652 "data_size": 63488 00:24:39.652 } 00:24:39.652 ] 00:24:39.652 }' 00:24:39.652 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.652 13:43:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.652 13:43:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.652 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.652 13:43:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:40.584 [2024-07-15 13:43:19.854328] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:40.584 [2024-07-15 13:43:19.854391] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:40.584 [2024-07-15 13:43:19.854486] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.874 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.147 "name": "raid_bdev1", 00:24:41.147 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:41.147 "strip_size_kb": 0, 00:24:41.147 "state": "online", 00:24:41.147 "raid_level": "raid1", 00:24:41.147 "superblock": 
true, 00:24:41.147 "num_base_bdevs": 4, 00:24:41.147 "num_base_bdevs_discovered": 3, 00:24:41.147 "num_base_bdevs_operational": 3, 00:24:41.147 "base_bdevs_list": [ 00:24:41.147 { 00:24:41.147 "name": "spare", 00:24:41.147 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:41.147 "is_configured": true, 00:24:41.147 "data_offset": 2048, 00:24:41.147 "data_size": 63488 00:24:41.147 }, 00:24:41.147 { 00:24:41.147 "name": null, 00:24:41.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.147 "is_configured": false, 00:24:41.147 "data_offset": 2048, 00:24:41.147 "data_size": 63488 00:24:41.147 }, 00:24:41.147 { 00:24:41.147 "name": "BaseBdev3", 00:24:41.147 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:41.147 "is_configured": true, 00:24:41.147 "data_offset": 2048, 00:24:41.147 "data_size": 63488 00:24:41.147 }, 00:24:41.147 { 00:24:41.147 "name": "BaseBdev4", 00:24:41.147 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:41.147 "is_configured": true, 00:24:41.147 "data_offset": 2048, 00:24:41.147 "data_size": 63488 00:24:41.147 } 00:24:41.147 ] 00:24:41.147 }' 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.147 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.404 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.405 "name": "raid_bdev1", 00:24:41.405 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:41.405 "strip_size_kb": 0, 00:24:41.405 "state": "online", 00:24:41.405 "raid_level": "raid1", 00:24:41.405 "superblock": true, 00:24:41.405 "num_base_bdevs": 4, 00:24:41.405 "num_base_bdevs_discovered": 3, 00:24:41.405 "num_base_bdevs_operational": 3, 00:24:41.405 "base_bdevs_list": [ 00:24:41.405 { 00:24:41.405 "name": "spare", 00:24:41.405 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:41.405 "is_configured": true, 00:24:41.405 "data_offset": 2048, 00:24:41.405 "data_size": 63488 00:24:41.405 }, 00:24:41.405 { 00:24:41.405 "name": null, 00:24:41.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.405 "is_configured": false, 00:24:41.405 "data_offset": 2048, 00:24:41.405 "data_size": 63488 00:24:41.405 }, 00:24:41.405 { 00:24:41.405 "name": "BaseBdev3", 00:24:41.405 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:41.405 "is_configured": true, 00:24:41.405 "data_offset": 2048, 00:24:41.405 "data_size": 63488 00:24:41.405 }, 00:24:41.405 { 00:24:41.405 "name": "BaseBdev4", 00:24:41.405 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:41.405 "is_configured": true, 00:24:41.405 "data_offset": 2048, 00:24:41.405 "data_size": 63488 00:24:41.405 } 00:24:41.405 ] 00:24:41.405 }' 00:24:41.405 13:43:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.405 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.662 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.662 "name": "raid_bdev1", 00:24:41.662 
"uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:41.662 "strip_size_kb": 0, 00:24:41.662 "state": "online", 00:24:41.662 "raid_level": "raid1", 00:24:41.662 "superblock": true, 00:24:41.662 "num_base_bdevs": 4, 00:24:41.662 "num_base_bdevs_discovered": 3, 00:24:41.662 "num_base_bdevs_operational": 3, 00:24:41.662 "base_bdevs_list": [ 00:24:41.662 { 00:24:41.662 "name": "spare", 00:24:41.662 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:41.662 "is_configured": true, 00:24:41.662 "data_offset": 2048, 00:24:41.662 "data_size": 63488 00:24:41.662 }, 00:24:41.662 { 00:24:41.662 "name": null, 00:24:41.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.662 "is_configured": false, 00:24:41.662 "data_offset": 2048, 00:24:41.662 "data_size": 63488 00:24:41.662 }, 00:24:41.662 { 00:24:41.662 "name": "BaseBdev3", 00:24:41.662 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:41.662 "is_configured": true, 00:24:41.662 "data_offset": 2048, 00:24:41.662 "data_size": 63488 00:24:41.662 }, 00:24:41.662 { 00:24:41.662 "name": "BaseBdev4", 00:24:41.662 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:41.662 "is_configured": true, 00:24:41.662 "data_offset": 2048, 00:24:41.662 "data_size": 63488 00:24:41.662 } 00:24:41.662 ] 00:24:41.662 }' 00:24:41.662 13:43:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.662 13:43:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.225 13:43:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:42.483 [2024-07-15 13:43:21.787082] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:42.483 [2024-07-15 13:43:21.787114] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:42.483 [2024-07-15 13:43:21.787180] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:42.483 [2024-07-15 13:43:21.787252] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:42.483 [2024-07-15 13:43:21.787264] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a988a0 name raid_bdev1, state offline 00:24:42.483 13:43:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.483 13:43:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:42.741 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:42.741 13:43:22 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:43.306 /dev/nbd0 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.306 1+0 records in 00:24:43.306 1+0 records out 00:24:43.306 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200413 s, 20.4 MB/s 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:43.306 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:43.565 /dev/nbd1 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:43.565 1+0 records in 00:24:43.565 1+0 records 
out 00:24:43.565 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000323036 s, 12.7 MB/s 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.565 13:43:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:43.822 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:44.386 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:44.386 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:44.387 13:43:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:44.644 [2024-07-15 13:43:23.996270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:44.644 [2024-07-15 13:43:23.996323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:44.644 [2024-07-15 13:43:23.996345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b12b40 00:24:44.644 [2024-07-15 13:43:23.996359] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:44.644 [2024-07-15 13:43:23.998017] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:44.644 [2024-07-15 13:43:23.998046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:44.644 [2024-07-15 13:43:23.998130] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:44.644 [2024-07-15 13:43:23.998159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:44.644 [2024-07-15 13:43:23.998269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:44.644 [2024-07-15 13:43:23.998342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:44.644 spare 00:24:44.644 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.645 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.903 [2024-07-15 13:43:24.098670] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a9cba0 00:24:44.903 [2024-07-15 13:43:24.098690] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:44.903 [2024-07-15 13:43:24.098909] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a99560 00:24:44.903 [2024-07-15 13:43:24.099077] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a9cba0 00:24:44.903 [2024-07-15 13:43:24.099088] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a9cba0 00:24:44.903 [2024-07-15 13:43:24.099198] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.903 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.903 "name": "raid_bdev1", 00:24:44.903 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:44.903 "strip_size_kb": 0, 
00:24:44.903 "state": "online", 00:24:44.903 "raid_level": "raid1", 00:24:44.903 "superblock": true, 00:24:44.903 "num_base_bdevs": 4, 00:24:44.903 "num_base_bdevs_discovered": 3, 00:24:44.903 "num_base_bdevs_operational": 3, 00:24:44.903 "base_bdevs_list": [ 00:24:44.903 { 00:24:44.903 "name": "spare", 00:24:44.903 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:44.903 "is_configured": true, 00:24:44.903 "data_offset": 2048, 00:24:44.903 "data_size": 63488 00:24:44.903 }, 00:24:44.903 { 00:24:44.903 "name": null, 00:24:44.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.903 "is_configured": false, 00:24:44.903 "data_offset": 2048, 00:24:44.903 "data_size": 63488 00:24:44.903 }, 00:24:44.903 { 00:24:44.903 "name": "BaseBdev3", 00:24:44.903 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:44.903 "is_configured": true, 00:24:44.903 "data_offset": 2048, 00:24:44.903 "data_size": 63488 00:24:44.903 }, 00:24:44.903 { 00:24:44.903 "name": "BaseBdev4", 00:24:44.903 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:44.903 "is_configured": true, 00:24:44.903 "data_offset": 2048, 00:24:44.903 "data_size": 63488 00:24:44.903 } 00:24:44.903 ] 00:24:44.903 }' 00:24:44.903 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.903 13:43:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.470 13:43:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.729 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.729 "name": "raid_bdev1", 00:24:45.729 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:45.729 "strip_size_kb": 0, 00:24:45.729 "state": "online", 00:24:45.729 "raid_level": "raid1", 00:24:45.729 "superblock": true, 00:24:45.729 "num_base_bdevs": 4, 00:24:45.729 "num_base_bdevs_discovered": 3, 00:24:45.729 "num_base_bdevs_operational": 3, 00:24:45.729 "base_bdevs_list": [ 00:24:45.729 { 00:24:45.729 "name": "spare", 00:24:45.729 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:45.729 "is_configured": true, 00:24:45.729 "data_offset": 2048, 00:24:45.729 "data_size": 63488 00:24:45.729 }, 00:24:45.729 { 00:24:45.729 "name": null, 00:24:45.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.729 "is_configured": false, 00:24:45.729 "data_offset": 2048, 00:24:45.729 "data_size": 63488 00:24:45.729 }, 00:24:45.729 { 00:24:45.729 "name": "BaseBdev3", 00:24:45.729 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:45.729 "is_configured": true, 00:24:45.729 "data_offset": 2048, 00:24:45.729 "data_size": 63488 00:24:45.729 }, 00:24:45.729 { 00:24:45.729 "name": "BaseBdev4", 00:24:45.729 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:45.729 "is_configured": true, 00:24:45.729 "data_offset": 2048, 00:24:45.729 "data_size": 63488 00:24:45.729 } 00:24:45.729 ] 00:24:45.729 }' 00:24:45.729 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.987 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.987 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:24:45.987 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.987 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.987 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:46.245 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.246 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:46.246 [2024-07-15 13:43:25.668840] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.504 13:43:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.504 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.762 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.762 "name": "raid_bdev1", 00:24:46.762 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:46.762 "strip_size_kb": 0, 00:24:46.762 "state": "online", 00:24:46.762 "raid_level": "raid1", 00:24:46.762 "superblock": true, 00:24:46.762 "num_base_bdevs": 4, 00:24:46.762 "num_base_bdevs_discovered": 2, 00:24:46.762 "num_base_bdevs_operational": 2, 00:24:46.762 "base_bdevs_list": [ 00:24:46.762 { 00:24:46.762 "name": null, 00:24:46.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.762 "is_configured": false, 00:24:46.762 "data_offset": 2048, 00:24:46.762 "data_size": 63488 00:24:46.762 }, 00:24:46.762 { 00:24:46.762 "name": null, 00:24:46.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.762 "is_configured": false, 00:24:46.762 "data_offset": 2048, 00:24:46.762 "data_size": 63488 00:24:46.762 }, 00:24:46.762 { 00:24:46.762 "name": "BaseBdev3", 00:24:46.762 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:46.762 "is_configured": true, 00:24:46.762 "data_offset": 2048, 00:24:46.762 "data_size": 63488 00:24:46.762 }, 00:24:46.762 { 00:24:46.762 "name": "BaseBdev4", 00:24:46.762 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:46.762 "is_configured": true, 00:24:46.762 "data_offset": 2048, 00:24:46.762 "data_size": 63488 00:24:46.762 } 00:24:46.762 ] 00:24:46.762 }' 00:24:46.762 13:43:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.762 13:43:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:47.329 13:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:47.615 [2024-07-15 13:43:26.771794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:47.615 [2024-07-15 13:43:26.771968] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:47.615 [2024-07-15 13:43:26.771986] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:47.615 [2024-07-15 13:43:26.772017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:47.615 [2024-07-15 13:43:26.776048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a9c740 00:24:47.615 [2024-07-15 13:43:26.778491] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:47.615 13:43:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.549 13:43:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.807 "name": "raid_bdev1", 00:24:48.807 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:48.807 "strip_size_kb": 0, 00:24:48.807 "state": "online", 00:24:48.807 "raid_level": "raid1", 00:24:48.807 "superblock": true, 00:24:48.807 "num_base_bdevs": 4, 00:24:48.807 "num_base_bdevs_discovered": 3, 00:24:48.807 "num_base_bdevs_operational": 3, 00:24:48.807 "process": { 00:24:48.807 "type": "rebuild", 00:24:48.807 "target": "spare", 00:24:48.807 "progress": { 00:24:48.807 "blocks": 24576, 00:24:48.807 "percent": 38 00:24:48.807 } 00:24:48.807 }, 00:24:48.807 "base_bdevs_list": [ 00:24:48.807 { 00:24:48.807 "name": "spare", 00:24:48.807 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:48.807 "is_configured": true, 00:24:48.807 "data_offset": 2048, 00:24:48.807 "data_size": 63488 00:24:48.807 }, 00:24:48.807 { 00:24:48.807 "name": null, 00:24:48.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.807 "is_configured": false, 00:24:48.807 "data_offset": 2048, 00:24:48.807 "data_size": 63488 00:24:48.807 }, 00:24:48.807 { 00:24:48.807 "name": "BaseBdev3", 00:24:48.807 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:48.807 "is_configured": true, 00:24:48.807 "data_offset": 2048, 00:24:48.807 "data_size": 63488 00:24:48.807 }, 00:24:48.807 { 00:24:48.807 "name": "BaseBdev4", 00:24:48.807 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:48.807 "is_configured": true, 00:24:48.807 "data_offset": 2048, 00:24:48.807 "data_size": 63488 00:24:48.807 } 00:24:48.807 ] 00:24:48.807 }' 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:24:48.807 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:49.065 [2024-07-15 13:43:28.352909] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:49.065 [2024-07-15 13:43:28.391567] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:49.065 [2024-07-15 13:43:28.391613] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.065 [2024-07-15 13:43:28.391629] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:49.065 [2024-07-15 13:43:28.391637] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:49.065 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:49.065 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.066 13:43:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.066 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.323 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:49.324 "name": "raid_bdev1", 00:24:49.324 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:49.324 "strip_size_kb": 0, 00:24:49.324 "state": "online", 00:24:49.324 "raid_level": "raid1", 00:24:49.324 "superblock": true, 00:24:49.324 "num_base_bdevs": 4, 00:24:49.324 "num_base_bdevs_discovered": 2, 00:24:49.324 "num_base_bdevs_operational": 2, 00:24:49.324 "base_bdevs_list": [ 00:24:49.324 { 00:24:49.324 "name": null, 00:24:49.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.324 "is_configured": false, 00:24:49.324 "data_offset": 2048, 00:24:49.324 "data_size": 63488 00:24:49.324 }, 00:24:49.324 { 00:24:49.324 "name": null, 00:24:49.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.324 "is_configured": false, 00:24:49.324 "data_offset": 2048, 00:24:49.324 "data_size": 63488 00:24:49.324 }, 00:24:49.324 { 00:24:49.324 "name": "BaseBdev3", 00:24:49.324 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:49.324 "is_configured": true, 00:24:49.324 "data_offset": 2048, 00:24:49.324 "data_size": 63488 00:24:49.324 }, 00:24:49.324 { 00:24:49.324 "name": "BaseBdev4", 00:24:49.324 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:49.324 "is_configured": true, 00:24:49.324 "data_offset": 2048, 00:24:49.324 "data_size": 63488 00:24:49.324 } 00:24:49.324 ] 00:24:49.324 }' 00:24:49.324 13:43:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.324 13:43:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:49.890 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:50.149 [2024-07-15 13:43:29.442715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:50.149 [2024-07-15 13:43:29.442769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.149 [2024-07-15 13:43:29.442794] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a9d010 00:24:50.149 [2024-07-15 13:43:29.442807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.149 [2024-07-15 13:43:29.443205] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.149 [2024-07-15 13:43:29.443223] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:50.149 [2024-07-15 13:43:29.443310] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:50.149 [2024-07-15 13:43:29.443328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:50.149 [2024-07-15 13:43:29.443339] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:50.149 [2024-07-15 13:43:29.443360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:50.149 [2024-07-15 13:43:29.447351] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b18420 00:24:50.149 spare 00:24:50.149 [2024-07-15 13:43:29.448839] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:50.149 13:43:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.085 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.344 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.344 "name": "raid_bdev1", 00:24:51.344 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:51.344 "strip_size_kb": 0, 00:24:51.344 "state": "online", 00:24:51.344 "raid_level": "raid1", 00:24:51.344 "superblock": true, 00:24:51.344 "num_base_bdevs": 4, 00:24:51.344 "num_base_bdevs_discovered": 3, 00:24:51.344 "num_base_bdevs_operational": 3, 00:24:51.344 "process": { 00:24:51.344 "type": "rebuild", 00:24:51.344 "target": "spare", 00:24:51.344 "progress": { 00:24:51.344 "blocks": 24576, 00:24:51.344 
"percent": 38 00:24:51.344 } 00:24:51.344 }, 00:24:51.344 "base_bdevs_list": [ 00:24:51.344 { 00:24:51.344 "name": "spare", 00:24:51.344 "uuid": "b9bdb677-98f2-5896-888d-c60f971e080a", 00:24:51.344 "is_configured": true, 00:24:51.344 "data_offset": 2048, 00:24:51.344 "data_size": 63488 00:24:51.344 }, 00:24:51.344 { 00:24:51.344 "name": null, 00:24:51.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.344 "is_configured": false, 00:24:51.344 "data_offset": 2048, 00:24:51.344 "data_size": 63488 00:24:51.344 }, 00:24:51.344 { 00:24:51.344 "name": "BaseBdev3", 00:24:51.344 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:51.344 "is_configured": true, 00:24:51.344 "data_offset": 2048, 00:24:51.344 "data_size": 63488 00:24:51.344 }, 00:24:51.344 { 00:24:51.344 "name": "BaseBdev4", 00:24:51.344 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:51.344 "is_configured": true, 00:24:51.344 "data_offset": 2048, 00:24:51.344 "data_size": 63488 00:24:51.344 } 00:24:51.344 ] 00:24:51.344 }' 00:24:51.344 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.344 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:51.344 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.602 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:51.602 13:43:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:51.860 [2024-07-15 13:43:31.028214] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.860 [2024-07-15 13:43:31.061604] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:51.860 [2024-07-15 13:43:31.061654] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.860 [2024-07-15 13:43:31.061671] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:51.860 [2024-07-15 13:43:31.061680] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:51.860 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.861 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.120 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.120 "name": "raid_bdev1", 00:24:52.120 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:52.120 "strip_size_kb": 0, 00:24:52.120 "state": 
"online", 00:24:52.120 "raid_level": "raid1", 00:24:52.120 "superblock": true, 00:24:52.120 "num_base_bdevs": 4, 00:24:52.120 "num_base_bdevs_discovered": 2, 00:24:52.120 "num_base_bdevs_operational": 2, 00:24:52.120 "base_bdevs_list": [ 00:24:52.120 { 00:24:52.120 "name": null, 00:24:52.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.120 "is_configured": false, 00:24:52.120 "data_offset": 2048, 00:24:52.120 "data_size": 63488 00:24:52.120 }, 00:24:52.120 { 00:24:52.120 "name": null, 00:24:52.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.120 "is_configured": false, 00:24:52.120 "data_offset": 2048, 00:24:52.120 "data_size": 63488 00:24:52.120 }, 00:24:52.120 { 00:24:52.120 "name": "BaseBdev3", 00:24:52.120 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:52.120 "is_configured": true, 00:24:52.120 "data_offset": 2048, 00:24:52.120 "data_size": 63488 00:24:52.120 }, 00:24:52.120 { 00:24:52.120 "name": "BaseBdev4", 00:24:52.120 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:52.120 "is_configured": true, 00:24:52.120 "data_offset": 2048, 00:24:52.120 "data_size": 63488 00:24:52.120 } 00:24:52.120 ] 00:24:52.120 }' 00:24:52.120 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.120 13:43:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.686 13:43:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:52.946 "name": "raid_bdev1", 00:24:52.946 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:52.946 "strip_size_kb": 0, 00:24:52.946 "state": "online", 00:24:52.946 "raid_level": "raid1", 00:24:52.946 "superblock": true, 00:24:52.946 "num_base_bdevs": 4, 00:24:52.946 "num_base_bdevs_discovered": 2, 00:24:52.946 "num_base_bdevs_operational": 2, 00:24:52.946 "base_bdevs_list": [ 00:24:52.946 { 00:24:52.946 "name": null, 00:24:52.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.946 "is_configured": false, 00:24:52.946 "data_offset": 2048, 00:24:52.946 "data_size": 63488 00:24:52.946 }, 00:24:52.946 { 00:24:52.946 "name": null, 00:24:52.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.946 "is_configured": false, 00:24:52.946 "data_offset": 2048, 00:24:52.946 "data_size": 63488 00:24:52.946 }, 00:24:52.946 { 00:24:52.946 "name": "BaseBdev3", 00:24:52.946 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:52.946 "is_configured": true, 00:24:52.946 "data_offset": 2048, 00:24:52.946 "data_size": 63488 00:24:52.946 }, 00:24:52.946 { 00:24:52.946 "name": "BaseBdev4", 00:24:52.946 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:52.946 "is_configured": true, 00:24:52.946 "data_offset": 2048, 00:24:52.946 "data_size": 63488 00:24:52.946 } 00:24:52.946 ] 00:24:52.946 }' 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:52.946 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:53.204 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:53.463 [2024-07-15 13:43:32.746107] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:53.463 [2024-07-15 13:43:32.746159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:53.463 [2024-07-15 13:43:32.746182] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b18e30 00:24:53.463 [2024-07-15 13:43:32.746195] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:53.463 [2024-07-15 13:43:32.746549] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:53.463 [2024-07-15 13:43:32.746566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:53.463 [2024-07-15 13:43:32.746632] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:53.463 [2024-07-15 13:43:32.746643] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:53.463 [2024-07-15 13:43:32.746655] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:53.463 BaseBdev1 00:24:53.463 13:43:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:54.397 
13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.397 13:43:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.654 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.654 "name": "raid_bdev1", 00:24:54.654 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:54.654 "strip_size_kb": 0, 00:24:54.654 "state": "online", 00:24:54.654 "raid_level": "raid1", 00:24:54.654 "superblock": true, 00:24:54.654 "num_base_bdevs": 4, 00:24:54.654 "num_base_bdevs_discovered": 2, 00:24:54.654 "num_base_bdevs_operational": 2, 00:24:54.654 "base_bdevs_list": [ 00:24:54.654 { 00:24:54.654 "name": null, 00:24:54.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.654 "is_configured": false, 00:24:54.654 "data_offset": 2048, 00:24:54.654 "data_size": 63488 00:24:54.654 }, 
00:24:54.654 { 00:24:54.654 "name": null, 00:24:54.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.654 "is_configured": false, 00:24:54.654 "data_offset": 2048, 00:24:54.654 "data_size": 63488 00:24:54.654 }, 00:24:54.654 { 00:24:54.654 "name": "BaseBdev3", 00:24:54.654 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:54.654 "is_configured": true, 00:24:54.654 "data_offset": 2048, 00:24:54.654 "data_size": 63488 00:24:54.654 }, 00:24:54.654 { 00:24:54.654 "name": "BaseBdev4", 00:24:54.654 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:54.654 "is_configured": true, 00:24:54.654 "data_offset": 2048, 00:24:54.654 "data_size": 63488 00:24:54.654 } 00:24:54.654 ] 00:24:54.654 }' 00:24:54.654 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.654 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:55.219 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:55.219 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.220 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:55.220 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:55.220 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.220 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.220 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.478 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.478 "name": "raid_bdev1", 00:24:55.478 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:55.478 
"strip_size_kb": 0, 00:24:55.478 "state": "online", 00:24:55.478 "raid_level": "raid1", 00:24:55.478 "superblock": true, 00:24:55.478 "num_base_bdevs": 4, 00:24:55.478 "num_base_bdevs_discovered": 2, 00:24:55.478 "num_base_bdevs_operational": 2, 00:24:55.478 "base_bdevs_list": [ 00:24:55.478 { 00:24:55.478 "name": null, 00:24:55.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.478 "is_configured": false, 00:24:55.478 "data_offset": 2048, 00:24:55.478 "data_size": 63488 00:24:55.478 }, 00:24:55.478 { 00:24:55.478 "name": null, 00:24:55.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.478 "is_configured": false, 00:24:55.478 "data_offset": 2048, 00:24:55.478 "data_size": 63488 00:24:55.478 }, 00:24:55.478 { 00:24:55.478 "name": "BaseBdev3", 00:24:55.478 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:55.478 "is_configured": true, 00:24:55.478 "data_offset": 2048, 00:24:55.478 "data_size": 63488 00:24:55.478 }, 00:24:55.478 { 00:24:55.478 "name": "BaseBdev4", 00:24:55.478 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:55.478 "is_configured": true, 00:24:55.478 "data_offset": 2048, 00:24:55.478 "data_size": 63488 00:24:55.478 } 00:24:55.478 ] 00:24:55.478 }' 00:24:55.478 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:55.737 13:43:34 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:55.737 13:43:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:56.030 [2024-07-15 13:43:35.184858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:56.030 [2024-07-15 13:43:35.185000] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:56.030 [2024-07-15 13:43:35.185017] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:56.030 request: 00:24:56.030 { 00:24:56.030 "base_bdev": "BaseBdev1", 00:24:56.030 "raid_bdev": "raid_bdev1", 00:24:56.030 "method": "bdev_raid_add_base_bdev", 00:24:56.030 "req_id": 1 00:24:56.030 } 00:24:56.030 Got JSON-RPC error response 00:24:56.030 response: 00:24:56.030 { 00:24:56.030 "code": -22, 00:24:56.030 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:56.030 } 00:24:56.030 13:43:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:56.030 13:43:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:56.030 13:43:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:56.030 13:43:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:56.030 13:43:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.962 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.220 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.220 "name": "raid_bdev1", 00:24:57.220 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:57.220 "strip_size_kb": 0, 00:24:57.220 "state": "online", 00:24:57.220 "raid_level": "raid1", 00:24:57.220 "superblock": true, 00:24:57.220 "num_base_bdevs": 4, 00:24:57.220 "num_base_bdevs_discovered": 2, 00:24:57.220 "num_base_bdevs_operational": 2, 00:24:57.220 "base_bdevs_list": [ 00:24:57.220 { 00:24:57.220 "name": null, 00:24:57.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.220 "is_configured": false, 00:24:57.220 "data_offset": 2048, 00:24:57.220 "data_size": 63488 00:24:57.220 }, 00:24:57.220 { 00:24:57.220 "name": null, 00:24:57.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.220 "is_configured": false, 00:24:57.220 "data_offset": 2048, 00:24:57.220 "data_size": 63488 00:24:57.220 }, 00:24:57.220 { 00:24:57.220 "name": "BaseBdev3", 00:24:57.220 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 00:24:57.220 "is_configured": true, 00:24:57.220 "data_offset": 2048, 00:24:57.220 "data_size": 63488 00:24:57.220 }, 00:24:57.220 { 00:24:57.220 "name": "BaseBdev4", 00:24:57.220 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:57.220 "is_configured": true, 00:24:57.220 "data_offset": 2048, 00:24:57.220 "data_size": 63488 00:24:57.220 } 00:24:57.220 ] 00:24:57.220 }' 00:24:57.220 13:43:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.220 13:43:36 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.787 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.080 "name": "raid_bdev1", 00:24:58.080 "uuid": "05d72578-fc1f-425d-8b9b-e3845d131080", 00:24:58.080 "strip_size_kb": 0, 00:24:58.080 "state": "online", 00:24:58.080 "raid_level": "raid1", 00:24:58.080 "superblock": true, 00:24:58.080 "num_base_bdevs": 4, 00:24:58.080 "num_base_bdevs_discovered": 2, 00:24:58.080 "num_base_bdevs_operational": 2, 00:24:58.080 "base_bdevs_list": [ 00:24:58.080 { 00:24:58.080 "name": null, 00:24:58.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.080 "is_configured": false, 00:24:58.080 "data_offset": 2048, 00:24:58.080 "data_size": 63488 00:24:58.080 }, 00:24:58.080 { 00:24:58.080 "name": null, 00:24:58.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.080 "is_configured": false, 00:24:58.080 "data_offset": 2048, 00:24:58.080 "data_size": 63488 00:24:58.080 }, 00:24:58.080 { 00:24:58.080 "name": "BaseBdev3", 00:24:58.080 "uuid": "c28e1759-3209-5d18-8fb7-95713bb0274c", 
00:24:58.080 "is_configured": true, 00:24:58.080 "data_offset": 2048, 00:24:58.080 "data_size": 63488 00:24:58.080 }, 00:24:58.080 { 00:24:58.080 "name": "BaseBdev4", 00:24:58.080 "uuid": "609c5b9e-20b2-5cd2-b88e-698186b12b30", 00:24:58.080 "is_configured": true, 00:24:58.080 "data_offset": 2048, 00:24:58.080 "data_size": 63488 00:24:58.080 } 00:24:58.080 ] 00:24:58.080 }' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2193016 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2193016 ']' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2193016 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2193016 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2193016' 00:24:58.080 killing process with pid 2193016 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2193016 00:24:58.080 
Received shutdown signal, test time was about 60.000000 seconds 00:24:58.080 00:24:58.080 Latency(us) 00:24:58.080 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:58.080 =================================================================================================================== 00:24:58.080 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:58.080 [2024-07-15 13:43:37.410860] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:58.080 [2024-07-15 13:43:37.410975] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:58.080 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2193016 00:24:58.080 [2024-07-15 13:43:37.411034] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:58.080 [2024-07-15 13:43:37.411053] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a9cba0 name raid_bdev1, state offline 00:24:58.080 [2024-07-15 13:43:37.458584] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:58.339 13:43:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:58.339 00:24:58.339 real 0m38.188s 00:24:58.339 user 0m55.125s 00:24:58.339 sys 0m7.107s 00:24:58.339 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:58.339 13:43:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:58.339 ************************************ 00:24:58.339 END TEST raid_rebuild_test_sb 00:24:58.339 ************************************ 00:24:58.339 13:43:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:58.339 13:43:37 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:24:58.339 13:43:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:58.339 13:43:37 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:24:58.339 13:43:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:58.598 ************************************ 00:24:58.598 START TEST raid_rebuild_test_io 00:24:58.598 ************************************ 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2198344 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2198344 /var/tmp/spdk-raid.sock 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
2198344 ']' 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:58.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:58.598 13:43:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:58.598 [2024-07-15 13:43:37.838174] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:24:58.598 [2024-07-15 13:43:37.838230] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2198344 ] 00:24:58.598 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:58.598 Zero copy mechanism will not be used. 
00:24:58.598 [2024-07-15 13:43:37.954292] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:58.856 [2024-07-15 13:43:38.064572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:58.856 [2024-07-15 13:43:38.130661] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:58.856 [2024-07-15 13:43:38.130704] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:59.422 13:43:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:59.423 13:43:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:59.423 13:43:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:59.423 13:43:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:59.682 BaseBdev1_malloc 00:24:59.682 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:59.940 [2024-07-15 13:43:39.252013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:59.940 [2024-07-15 13:43:39.252066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:59.940 [2024-07-15 13:43:39.252091] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa11d40 00:24:59.940 [2024-07-15 13:43:39.252103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:59.940 [2024-07-15 13:43:39.253818] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:59.940 [2024-07-15 13:43:39.253847] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:59.940 BaseBdev1 
00:24:59.940 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:59.940 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:00.198 BaseBdev2_malloc 00:25:00.198 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:00.457 [2024-07-15 13:43:39.734147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:00.457 [2024-07-15 13:43:39.734198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.457 [2024-07-15 13:43:39.734222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa12860 00:25:00.457 [2024-07-15 13:43:39.734235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.457 [2024-07-15 13:43:39.735651] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.457 [2024-07-15 13:43:39.735680] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:00.457 BaseBdev2 00:25:00.457 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:00.457 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:00.715 BaseBdev3_malloc 00:25:00.715 13:43:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:00.973 [2024-07-15 13:43:40.227999] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:00.973 [2024-07-15 13:43:40.228052] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.973 [2024-07-15 13:43:40.228074] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbbf8f0 00:25:00.973 [2024-07-15 13:43:40.228087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.973 [2024-07-15 13:43:40.229570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.973 [2024-07-15 13:43:40.229597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:00.973 BaseBdev3 00:25:00.973 13:43:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:00.973 13:43:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:01.232 BaseBdev4_malloc 00:25:01.232 13:43:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:01.490 [2024-07-15 13:43:40.717972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:01.490 [2024-07-15 13:43:40.718018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:01.490 [2024-07-15 13:43:40.718039] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbbead0 00:25:01.490 [2024-07-15 13:43:40.718052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:01.490 [2024-07-15 13:43:40.719472] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:01.490 [2024-07-15 13:43:40.719500] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:25:01.490 BaseBdev4 00:25:01.490 13:43:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:01.490 spare_malloc 00:25:01.749 13:43:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:01.749 spare_delay 00:25:01.749 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:02.007 [2024-07-15 13:43:41.376278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:02.007 [2024-07-15 13:43:41.376330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:02.007 [2024-07-15 13:43:41.376352] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbc35b0 00:25:02.007 [2024-07-15 13:43:41.376364] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:02.007 [2024-07-15 13:43:41.377881] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:02.007 [2024-07-15 13:43:41.377915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:02.007 spare 00:25:02.007 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:02.265 [2024-07-15 13:43:41.612936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:02.265 [2024-07-15 13:43:41.614205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:25:02.265 [2024-07-15 13:43:41.614263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:02.265 [2024-07-15 13:43:41.614309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:02.265 [2024-07-15 13:43:41.614392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb428a0 00:25:02.265 [2024-07-15 13:43:41.614402] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:02.265 [2024-07-15 13:43:41.614620] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbbce10 00:25:02.265 [2024-07-15 13:43:41.614772] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb428a0 00:25:02.265 [2024-07-15 13:43:41.614783] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb428a0 00:25:02.265 [2024-07-15 13:43:41.614897] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.265 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:02.265 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.266 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.524 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.524 "name": "raid_bdev1", 00:25:02.524 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:02.524 "strip_size_kb": 0, 00:25:02.524 "state": "online", 00:25:02.524 "raid_level": "raid1", 00:25:02.524 "superblock": false, 00:25:02.524 "num_base_bdevs": 4, 00:25:02.524 "num_base_bdevs_discovered": 4, 00:25:02.524 "num_base_bdevs_operational": 4, 00:25:02.524 "base_bdevs_list": [ 00:25:02.524 { 00:25:02.524 "name": "BaseBdev1", 00:25:02.524 "uuid": "a105fa2b-9b3f-584a-8494-16cc1b677d34", 00:25:02.524 "is_configured": true, 00:25:02.524 "data_offset": 0, 00:25:02.524 "data_size": 65536 00:25:02.524 }, 00:25:02.524 { 00:25:02.524 "name": "BaseBdev2", 00:25:02.524 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:02.524 "is_configured": true, 00:25:02.524 "data_offset": 0, 00:25:02.524 "data_size": 65536 00:25:02.524 }, 00:25:02.524 { 00:25:02.524 "name": "BaseBdev3", 00:25:02.524 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:02.524 "is_configured": true, 00:25:02.524 "data_offset": 0, 00:25:02.524 "data_size": 65536 00:25:02.524 }, 00:25:02.524 { 00:25:02.524 "name": "BaseBdev4", 00:25:02.524 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:02.524 "is_configured": true, 00:25:02.524 "data_offset": 0, 00:25:02.524 "data_size": 65536 00:25:02.524 } 00:25:02.524 ] 00:25:02.524 }' 00:25:02.524 13:43:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:25:02.524 13:43:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.090 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:03.090 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:03.348 [2024-07-15 13:43:42.712145] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:03.348 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:03.348 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:03.349 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.607 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:03.607 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:03.607 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:03.607 13:43:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:03.865 [2024-07-15 13:43:43.094961] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb48970 00:25:03.865 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:03.865 Zero copy mechanism will not be used. 00:25:03.865 Running I/O for 60 seconds... 
00:25:03.865 [2024-07-15 13:43:43.212067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:03.865 [2024-07-15 13:43:43.220251] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb48970 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.865 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.124 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.124 "name": "raid_bdev1", 00:25:04.124 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:04.124 "strip_size_kb": 0, 00:25:04.124 "state": "online", 00:25:04.124 "raid_level": "raid1", 00:25:04.124 "superblock": false, 
00:25:04.124 "num_base_bdevs": 4, 00:25:04.124 "num_base_bdevs_discovered": 3, 00:25:04.124 "num_base_bdevs_operational": 3, 00:25:04.124 "base_bdevs_list": [ 00:25:04.124 { 00:25:04.124 "name": null, 00:25:04.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.124 "is_configured": false, 00:25:04.124 "data_offset": 0, 00:25:04.124 "data_size": 65536 00:25:04.124 }, 00:25:04.124 { 00:25:04.124 "name": "BaseBdev2", 00:25:04.124 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:04.124 "is_configured": true, 00:25:04.124 "data_offset": 0, 00:25:04.124 "data_size": 65536 00:25:04.124 }, 00:25:04.124 { 00:25:04.124 "name": "BaseBdev3", 00:25:04.124 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:04.124 "is_configured": true, 00:25:04.124 "data_offset": 0, 00:25:04.124 "data_size": 65536 00:25:04.124 }, 00:25:04.124 { 00:25:04.124 "name": "BaseBdev4", 00:25:04.124 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:04.124 "is_configured": true, 00:25:04.124 "data_offset": 0, 00:25:04.124 "data_size": 65536 00:25:04.124 } 00:25:04.124 ] 00:25:04.124 }' 00:25:04.124 13:43:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.124 13:43:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:05.057 13:43:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:05.057 [2024-07-15 13:43:44.394323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.057 [2024-07-15 13:43:44.441299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x718fa0 00:25:05.057 [2024-07-15 13:43:44.443705] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.057 13:43:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:05.315 [2024-07-15 
13:43:44.565315] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:05.315 [2024-07-15 13:43:44.565735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:05.315 [2024-07-15 13:43:44.707847] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:05.315 [2024-07-15 13:43:44.708516] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:05.882 [2024-07-15 13:43:45.066152] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.140 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.140 [2024-07-15 13:43:45.465137] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:06.398 [2024-07-15 13:43:45.691137] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:06.398 
13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.398 "name": "raid_bdev1", 00:25:06.398 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:06.398 "strip_size_kb": 0, 00:25:06.398 "state": "online", 00:25:06.398 "raid_level": "raid1", 00:25:06.398 "superblock": false, 00:25:06.398 "num_base_bdevs": 4, 00:25:06.398 "num_base_bdevs_discovered": 4, 00:25:06.398 "num_base_bdevs_operational": 4, 00:25:06.398 "process": { 00:25:06.398 "type": "rebuild", 00:25:06.398 "target": "spare", 00:25:06.398 "progress": { 00:25:06.398 "blocks": 14336, 00:25:06.398 "percent": 21 00:25:06.398 } 00:25:06.398 }, 00:25:06.398 "base_bdevs_list": [ 00:25:06.398 { 00:25:06.398 "name": "spare", 00:25:06.398 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:06.398 "is_configured": true, 00:25:06.398 "data_offset": 0, 00:25:06.398 "data_size": 65536 00:25:06.398 }, 00:25:06.398 { 00:25:06.398 "name": "BaseBdev2", 00:25:06.398 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:06.398 "is_configured": true, 00:25:06.398 "data_offset": 0, 00:25:06.398 "data_size": 65536 00:25:06.398 }, 00:25:06.398 { 00:25:06.398 "name": "BaseBdev3", 00:25:06.398 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:06.398 "is_configured": true, 00:25:06.398 "data_offset": 0, 00:25:06.398 "data_size": 65536 00:25:06.398 }, 00:25:06.398 { 00:25:06.398 "name": "BaseBdev4", 00:25:06.398 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:06.398 "is_configured": true, 00:25:06.398 "data_offset": 0, 00:25:06.398 "data_size": 65536 00:25:06.398 } 00:25:06.398 ] 00:25:06.398 }' 00:25:06.398 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.398 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.398 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.398 13:43:45 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.398 13:43:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:06.656 [2024-07-15 13:43:46.039142] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.656 [2024-07-15 13:43:46.050032] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:06.914 [2024-07-15 13:43:46.147393] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:06.914 [2024-07-15 13:43:46.161513] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.914 [2024-07-15 13:43:46.161548] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.914 [2024-07-15 13:43:46.161559] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:06.914 [2024-07-15 13:43:46.175708] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb48970 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.914 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.172 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.172 "name": "raid_bdev1", 00:25:07.172 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:07.172 "strip_size_kb": 0, 00:25:07.172 "state": "online", 00:25:07.172 "raid_level": "raid1", 00:25:07.172 "superblock": false, 00:25:07.172 "num_base_bdevs": 4, 00:25:07.172 "num_base_bdevs_discovered": 3, 00:25:07.172 "num_base_bdevs_operational": 3, 00:25:07.172 "base_bdevs_list": [ 00:25:07.172 { 00:25:07.172 "name": null, 00:25:07.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.172 "is_configured": false, 00:25:07.172 "data_offset": 0, 00:25:07.172 "data_size": 65536 00:25:07.172 }, 00:25:07.172 { 00:25:07.172 "name": "BaseBdev2", 00:25:07.172 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:07.172 "is_configured": true, 00:25:07.172 "data_offset": 0, 00:25:07.172 "data_size": 65536 00:25:07.172 }, 00:25:07.172 { 00:25:07.172 "name": "BaseBdev3", 00:25:07.172 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:07.172 "is_configured": true, 00:25:07.172 "data_offset": 0, 00:25:07.172 "data_size": 65536 00:25:07.172 }, 00:25:07.172 { 00:25:07.172 "name": "BaseBdev4", 00:25:07.172 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:07.172 "is_configured": true, 00:25:07.172 "data_offset": 0, 
00:25:07.172 "data_size": 65536 00:25:07.172 } 00:25:07.172 ] 00:25:07.172 }' 00:25:07.172 13:43:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.172 13:43:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.738 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.996 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.996 "name": "raid_bdev1", 00:25:07.996 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:07.996 "strip_size_kb": 0, 00:25:07.996 "state": "online", 00:25:07.996 "raid_level": "raid1", 00:25:07.996 "superblock": false, 00:25:07.996 "num_base_bdevs": 4, 00:25:07.996 "num_base_bdevs_discovered": 3, 00:25:07.996 "num_base_bdevs_operational": 3, 00:25:07.996 "base_bdevs_list": [ 00:25:07.996 { 00:25:07.996 "name": null, 00:25:07.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.996 "is_configured": false, 00:25:07.996 "data_offset": 0, 00:25:07.996 "data_size": 65536 00:25:07.996 }, 00:25:07.996 { 00:25:07.996 "name": "BaseBdev2", 00:25:07.996 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:07.996 
"is_configured": true, 00:25:07.996 "data_offset": 0, 00:25:07.996 "data_size": 65536 00:25:07.996 }, 00:25:07.996 { 00:25:07.996 "name": "BaseBdev3", 00:25:07.996 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:07.996 "is_configured": true, 00:25:07.996 "data_offset": 0, 00:25:07.996 "data_size": 65536 00:25:07.996 }, 00:25:07.996 { 00:25:07.996 "name": "BaseBdev4", 00:25:07.996 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:07.996 "is_configured": true, 00:25:07.996 "data_offset": 0, 00:25:07.996 "data_size": 65536 00:25:07.996 } 00:25:07.996 ] 00:25:07.996 }' 00:25:07.996 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.254 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:08.254 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.254 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.254 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.512 [2024-07-15 13:43:47.703395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.512 13:43:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:08.512 [2024-07-15 13:43:47.788588] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb49150 00:25:08.512 [2024-07-15 13:43:47.790155] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.512 [2024-07-15 13:43:47.899139] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:08.512 [2024-07-15 13:43:47.899444] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 
offset_begin: 0 offset_end: 6144 00:25:08.769 [2024-07-15 13:43:48.143658] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:08.769 [2024-07-15 13:43:48.143857] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:09.334 [2024-07-15 13:43:48.649135] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.593 13:43:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.593 [2024-07-15 13:43:48.892354] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:09.593 [2024-07-15 13:43:49.013827] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:09.593 [2024-07-15 13:43:49.014434] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:25:09.850 "name": "raid_bdev1", 00:25:09.850 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:09.850 "strip_size_kb": 0, 00:25:09.850 "state": "online", 00:25:09.850 "raid_level": "raid1", 00:25:09.850 "superblock": false, 00:25:09.850 "num_base_bdevs": 4, 00:25:09.850 "num_base_bdevs_discovered": 4, 00:25:09.850 "num_base_bdevs_operational": 4, 00:25:09.850 "process": { 00:25:09.850 "type": "rebuild", 00:25:09.850 "target": "spare", 00:25:09.850 "progress": { 00:25:09.850 "blocks": 14336, 00:25:09.850 "percent": 21 00:25:09.850 } 00:25:09.850 }, 00:25:09.850 "base_bdevs_list": [ 00:25:09.850 { 00:25:09.850 "name": "spare", 00:25:09.850 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:09.850 "is_configured": true, 00:25:09.850 "data_offset": 0, 00:25:09.850 "data_size": 65536 00:25:09.850 }, 00:25:09.850 { 00:25:09.850 "name": "BaseBdev2", 00:25:09.850 "uuid": "d04c7c5d-4950-554b-8aba-175a673c54c9", 00:25:09.850 "is_configured": true, 00:25:09.850 "data_offset": 0, 00:25:09.850 "data_size": 65536 00:25:09.850 }, 00:25:09.850 { 00:25:09.850 "name": "BaseBdev3", 00:25:09.850 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:09.850 "is_configured": true, 00:25:09.850 "data_offset": 0, 00:25:09.850 "data_size": 65536 00:25:09.850 }, 00:25:09.850 { 00:25:09.850 "name": "BaseBdev4", 00:25:09.850 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:09.850 "is_configured": true, 00:25:09.850 "data_offset": 0, 00:25:09.850 "data_size": 65536 00:25:09.850 } 00:25:09.850 ] 00:25:09.850 }' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.850 13:43:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:09.850 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:10.108 [2024-07-15 13:43:49.344006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:10.108 [2024-07-15 13:43:49.460320] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xb48970 00:25:10.108 [2024-07-15 13:43:49.460352] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xb49150 00:25:10.108 [2024-07-15 13:43:49.460396] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:10.108 [2024-07-15 13:43:49.479367] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:10.108 
13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.108 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.365 [2024-07-15 13:43:49.718946] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:10.365 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.365 "name": "raid_bdev1", 00:25:10.365 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:10.365 "strip_size_kb": 0, 00:25:10.365 "state": "online", 00:25:10.365 "raid_level": "raid1", 00:25:10.365 "superblock": false, 00:25:10.365 "num_base_bdevs": 4, 00:25:10.365 "num_base_bdevs_discovered": 3, 00:25:10.365 "num_base_bdevs_operational": 3, 00:25:10.365 "process": { 00:25:10.365 "type": "rebuild", 00:25:10.365 "target": "spare", 00:25:10.365 "progress": { 00:25:10.365 "blocks": 20480, 00:25:10.365 "percent": 31 00:25:10.365 } 00:25:10.365 }, 00:25:10.365 "base_bdevs_list": [ 00:25:10.365 { 00:25:10.365 "name": "spare", 00:25:10.365 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:10.365 "is_configured": true, 00:25:10.365 "data_offset": 0, 00:25:10.365 "data_size": 65536 00:25:10.365 }, 00:25:10.365 { 00:25:10.365 "name": null, 00:25:10.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.365 "is_configured": false, 00:25:10.365 "data_offset": 0, 00:25:10.365 "data_size": 65536 00:25:10.365 }, 00:25:10.365 { 00:25:10.365 "name": "BaseBdev3", 00:25:10.365 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:10.365 "is_configured": true, 00:25:10.365 "data_offset": 0, 00:25:10.365 "data_size": 65536 00:25:10.365 }, 00:25:10.365 { 00:25:10.365 "name": "BaseBdev4", 
00:25:10.365 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:10.365 "is_configured": true, 00:25:10.365 "data_offset": 0, 00:25:10.365 "data_size": 65536 00:25:10.365 } 00:25:10.365 ] 00:25:10.365 }' 00:25:10.365 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.365 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.365 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.622 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.622 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=933 00:25:10.622 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.623 13:43:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.879 "name": "raid_bdev1", 00:25:10.879 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:10.879 "strip_size_kb": 0, 
00:25:10.879 "state": "online", 00:25:10.879 "raid_level": "raid1", 00:25:10.879 "superblock": false, 00:25:10.879 "num_base_bdevs": 4, 00:25:10.879 "num_base_bdevs_discovered": 3, 00:25:10.879 "num_base_bdevs_operational": 3, 00:25:10.879 "process": { 00:25:10.879 "type": "rebuild", 00:25:10.879 "target": "spare", 00:25:10.879 "progress": { 00:25:10.879 "blocks": 24576, 00:25:10.879 "percent": 37 00:25:10.879 } 00:25:10.879 }, 00:25:10.879 "base_bdevs_list": [ 00:25:10.879 { 00:25:10.879 "name": "spare", 00:25:10.879 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:10.879 "is_configured": true, 00:25:10.879 "data_offset": 0, 00:25:10.879 "data_size": 65536 00:25:10.879 }, 00:25:10.879 { 00:25:10.879 "name": null, 00:25:10.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.879 "is_configured": false, 00:25:10.879 "data_offset": 0, 00:25:10.879 "data_size": 65536 00:25:10.879 }, 00:25:10.879 { 00:25:10.879 "name": "BaseBdev3", 00:25:10.879 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:10.879 "is_configured": true, 00:25:10.879 "data_offset": 0, 00:25:10.879 "data_size": 65536 00:25:10.879 }, 00:25:10.879 { 00:25:10.879 "name": "BaseBdev4", 00:25:10.879 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:10.879 "is_configured": true, 00:25:10.879 "data_offset": 0, 00:25:10.879 "data_size": 65536 00:25:10.879 } 00:25:10.879 ] 00:25:10.879 }' 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.879 13:43:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:11.442 [2024-07-15 13:43:50.806961] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:11.442 [2024-07-15 13:43:50.807605] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.005 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.005 [2024-07-15 13:43:51.241670] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.262 "name": "raid_bdev1", 00:25:12.262 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:12.262 "strip_size_kb": 0, 00:25:12.262 "state": "online", 00:25:12.262 "raid_level": "raid1", 00:25:12.262 "superblock": false, 00:25:12.262 "num_base_bdevs": 4, 00:25:12.262 "num_base_bdevs_discovered": 3, 00:25:12.262 "num_base_bdevs_operational": 3, 00:25:12.262 "process": { 00:25:12.262 "type": "rebuild", 00:25:12.262 "target": "spare", 00:25:12.262 
"progress": { 00:25:12.262 "blocks": 45056, 00:25:12.262 "percent": 68 00:25:12.262 } 00:25:12.262 }, 00:25:12.262 "base_bdevs_list": [ 00:25:12.262 { 00:25:12.262 "name": "spare", 00:25:12.262 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:12.262 "is_configured": true, 00:25:12.262 "data_offset": 0, 00:25:12.262 "data_size": 65536 00:25:12.262 }, 00:25:12.262 { 00:25:12.262 "name": null, 00:25:12.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.262 "is_configured": false, 00:25:12.262 "data_offset": 0, 00:25:12.262 "data_size": 65536 00:25:12.262 }, 00:25:12.262 { 00:25:12.262 "name": "BaseBdev3", 00:25:12.262 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:12.262 "is_configured": true, 00:25:12.262 "data_offset": 0, 00:25:12.262 "data_size": 65536 00:25:12.262 }, 00:25:12.262 { 00:25:12.262 "name": "BaseBdev4", 00:25:12.262 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:12.262 "is_configured": true, 00:25:12.262 "data_offset": 0, 00:25:12.262 "data_size": 65536 00:25:12.262 } 00:25:12.262 ] 00:25:12.262 }' 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.262 [2024-07-15 13:43:51.444206] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.262 13:43:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:12.826 [2024-07-15 13:43:52.015307] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:25:12.826 [2024-07-15 13:43:52.134963] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.391 [2024-07-15 13:43:52.588532] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:13.391 [2024-07-15 13:43:52.688767] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:13.391 [2024-07-15 13:43:52.689980] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.391 "name": "raid_bdev1", 00:25:13.391 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:13.391 "strip_size_kb": 0, 00:25:13.391 "state": "online", 00:25:13.391 "raid_level": "raid1", 00:25:13.391 "superblock": false, 00:25:13.391 "num_base_bdevs": 4, 00:25:13.391 "num_base_bdevs_discovered": 3, 00:25:13.391 "num_base_bdevs_operational": 3, 00:25:13.391 "base_bdevs_list": [ 00:25:13.391 { 
00:25:13.391 "name": "spare", 00:25:13.391 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:13.391 "is_configured": true, 00:25:13.391 "data_offset": 0, 00:25:13.391 "data_size": 65536 00:25:13.391 }, 00:25:13.391 { 00:25:13.391 "name": null, 00:25:13.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.391 "is_configured": false, 00:25:13.391 "data_offset": 0, 00:25:13.391 "data_size": 65536 00:25:13.391 }, 00:25:13.391 { 00:25:13.391 "name": "BaseBdev3", 00:25:13.391 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:13.391 "is_configured": true, 00:25:13.391 "data_offset": 0, 00:25:13.391 "data_size": 65536 00:25:13.391 }, 00:25:13.391 { 00:25:13.391 "name": "BaseBdev4", 00:25:13.391 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:13.391 "is_configured": true, 00:25:13.391 "data_offset": 0, 00:25:13.391 "data_size": 65536 00:25:13.391 } 00:25:13.391 ] 00:25:13.391 }' 00:25:13.391 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.649 13:43:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.906 "name": "raid_bdev1", 00:25:13.906 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:13.906 "strip_size_kb": 0, 00:25:13.906 "state": "online", 00:25:13.906 "raid_level": "raid1", 00:25:13.906 "superblock": false, 00:25:13.906 "num_base_bdevs": 4, 00:25:13.906 "num_base_bdevs_discovered": 3, 00:25:13.906 "num_base_bdevs_operational": 3, 00:25:13.906 "base_bdevs_list": [ 00:25:13.906 { 00:25:13.906 "name": "spare", 00:25:13.906 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:13.906 "is_configured": true, 00:25:13.906 "data_offset": 0, 00:25:13.906 "data_size": 65536 00:25:13.906 }, 00:25:13.906 { 00:25:13.906 "name": null, 00:25:13.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.906 "is_configured": false, 00:25:13.906 "data_offset": 0, 00:25:13.906 "data_size": 65536 00:25:13.906 }, 00:25:13.906 { 00:25:13.906 "name": "BaseBdev3", 00:25:13.906 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:13.906 "is_configured": true, 00:25:13.906 "data_offset": 0, 00:25:13.906 "data_size": 65536 00:25:13.906 }, 00:25:13.906 { 00:25:13.906 "name": "BaseBdev4", 00:25:13.906 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:13.906 "is_configured": true, 00:25:13.906 "data_offset": 0, 00:25:13.906 "data_size": 65536 00:25:13.906 } 00:25:13.906 ] 00:25:13.906 }' 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.906 13:43:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.906 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.907 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.163 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.163 "name": "raid_bdev1", 00:25:14.164 "uuid": "0dbcb32f-410a-433d-ad9f-786cd10b0f00", 00:25:14.164 "strip_size_kb": 0, 00:25:14.164 "state": "online", 00:25:14.164 "raid_level": "raid1", 00:25:14.164 "superblock": false, 00:25:14.164 "num_base_bdevs": 4, 
00:25:14.164 "num_base_bdevs_discovered": 3, 00:25:14.164 "num_base_bdevs_operational": 3, 00:25:14.164 "base_bdevs_list": [ 00:25:14.164 { 00:25:14.164 "name": "spare", 00:25:14.164 "uuid": "53d2f8c3-3dc8-5422-8f4f-bb1a1b50d18b", 00:25:14.164 "is_configured": true, 00:25:14.164 "data_offset": 0, 00:25:14.164 "data_size": 65536 00:25:14.164 }, 00:25:14.164 { 00:25:14.164 "name": null, 00:25:14.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.164 "is_configured": false, 00:25:14.164 "data_offset": 0, 00:25:14.164 "data_size": 65536 00:25:14.164 }, 00:25:14.164 { 00:25:14.164 "name": "BaseBdev3", 00:25:14.164 "uuid": "ab09a4cd-eda1-5808-8a4f-fb7a6fe933af", 00:25:14.164 "is_configured": true, 00:25:14.164 "data_offset": 0, 00:25:14.164 "data_size": 65536 00:25:14.164 }, 00:25:14.164 { 00:25:14.164 "name": "BaseBdev4", 00:25:14.164 "uuid": "150048bd-8e3c-5e65-b590-d92310759c2a", 00:25:14.164 "is_configured": true, 00:25:14.164 "data_offset": 0, 00:25:14.164 "data_size": 65536 00:25:14.164 } 00:25:14.164 ] 00:25:14.164 }' 00:25:14.164 13:43:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.164 13:43:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:14.815 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:15.073 [2024-07-15 13:43:54.283645] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:15.073 [2024-07-15 13:43:54.283685] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:15.073 00:25:15.073 Latency(us) 00:25:15.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.073 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:15.073 raid_bdev1 : 11.24 92.62 277.87 0.00 0.00 14859.81 299.19 
120358.29 00:25:15.073 =================================================================================================================== 00:25:15.073 Total : 92.62 277.87 0.00 0.00 14859.81 299.19 120358.29 00:25:15.073 [2024-07-15 13:43:54.367848] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:15.073 [2024-07-15 13:43:54.367880] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:15.073 [2024-07-15 13:43:54.367982] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:15.073 [2024-07-15 13:43:54.367995] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb428a0 name raid_bdev1, state offline 00:25:15.073 0 00:25:15.073 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.073 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.331 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:15.589 /dev/nbd0 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.589 1+0 records in 00:25:15.589 1+0 records out 00:25:15.589 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239953 s, 17.1 MB/s 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat 
-c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@12 -- # local i 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.589 13:43:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:15.848 /dev/nbd1 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.848 1+0 records in 00:25:15.848 1+0 records out 00:25:15.848 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299949 s, 13.7 MB/s 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.848 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:16.105 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:16.105 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:16.105 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:16.106 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:16.106 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:16.106 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.106 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:16.363 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:16.364 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:16.622 /dev/nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:16.622 1+0 records in 00:25:16.622 1+0 records out 00:25:16.622 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261257 s, 15.7 MB/s 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.622 13:43:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:16.880 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2198344 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2198344 ']' 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2198344 00:25:17.137 
13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:17.137 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2198344 00:25:17.395 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:17.395 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:17.395 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2198344' 00:25:17.395 killing process with pid 2198344 00:25:17.395 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2198344 00:25:17.395 Received shutdown signal, test time was about 13.434961 seconds 00:25:17.395 00:25:17.395 Latency(us) 00:25:17.395 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:17.395 =================================================================================================================== 00:25:17.395 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:17.395 [2024-07-15 13:43:56.565155] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:17.395 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2198344 00:25:17.395 [2024-07-15 13:43:56.609804] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:17.653 00:25:17.653 real 0m19.069s 00:25:17.653 user 0m29.501s 00:25:17.653 sys 0m3.474s 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:17.653 ************************************ 00:25:17.653 END TEST 
raid_rebuild_test_io 00:25:17.653 ************************************ 00:25:17.653 13:43:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:17.653 13:43:56 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:17.653 13:43:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:17.653 13:43:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:17.653 13:43:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:17.653 ************************************ 00:25:17.653 START TEST raid_rebuild_test_sb_io 00:25:17.653 ************************************ 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:17.653 
13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:17.653 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # 
create_arg+=' -s' 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2201037 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2201037 /var/tmp/spdk-raid.sock 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2201037 ']' 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:17.654 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:17.654 13:43:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:17.654 [2024-07-15 13:43:57.037846] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:25:17.654 [2024-07-15 13:43:57.037989] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2201037 ] 00:25:17.654 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:17.654 Zero copy mechanism will not be used. 
00:25:17.912 [2024-07-15 13:43:57.235613] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:17.912 [2024-07-15 13:43:57.332402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:18.169 [2024-07-15 13:43:57.393663] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:18.169 [2024-07-15 13:43:57.393701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:18.736 13:43:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:18.736 13:43:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:18.736 13:43:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:18.736 13:43:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:18.736 BaseBdev1_malloc 00:25:18.736 13:43:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:19.302 [2024-07-15 13:43:58.638443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:19.302 [2024-07-15 13:43:58.638494] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.302 [2024-07-15 13:43:58.638521] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2dd40 00:25:19.302 [2024-07-15 13:43:58.638541] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.302 [2024-07-15 13:43:58.640396] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.302 [2024-07-15 13:43:58.640426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:19.302 
BaseBdev1 00:25:19.302 13:43:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:19.302 13:43:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:19.560 BaseBdev2_malloc 00:25:19.560 13:43:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:19.818 [2024-07-15 13:43:59.137363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:19.818 [2024-07-15 13:43:59.137412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:19.818 [2024-07-15 13:43:59.137437] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2e860 00:25:19.818 [2024-07-15 13:43:59.137450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:19.818 [2024-07-15 13:43:59.139000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:19.818 [2024-07-15 13:43:59.139028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:19.818 BaseBdev2 00:25:19.818 13:43:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:19.818 13:43:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:20.384 BaseBdev3_malloc 00:25:20.384 13:43:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:20.641 [2024-07-15 
13:43:59.887934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:20.641 [2024-07-15 13:43:59.887980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:20.641 [2024-07-15 13:43:59.888001] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfdb8f0 00:25:20.641 [2024-07-15 13:43:59.888015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:20.641 [2024-07-15 13:43:59.889555] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:20.641 [2024-07-15 13:43:59.889583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:20.641 BaseBdev3 00:25:20.641 13:43:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:20.641 13:43:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:21.206 BaseBdev4_malloc 00:25:21.207 13:44:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:21.464 [2024-07-15 13:44:00.643878] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:21.464 [2024-07-15 13:44:00.643932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.464 [2024-07-15 13:44:00.643955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfdaad0 00:25:21.464 [2024-07-15 13:44:00.643968] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.464 [2024-07-15 13:44:00.645546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.464 [2024-07-15 13:44:00.645573] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:21.464 BaseBdev4 00:25:21.464 13:44:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:22.030 spare_malloc 00:25:22.030 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:22.030 spare_delay 00:25:22.030 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:22.287 [2024-07-15 13:44:01.558892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:22.287 [2024-07-15 13:44:01.558943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.287 [2024-07-15 13:44:01.558964] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfdf5b0 00:25:22.287 [2024-07-15 13:44:01.558978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.287 [2024-07-15 13:44:01.560544] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.287 [2024-07-15 13:44:01.560571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:22.287 spare 00:25:22.287 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:22.545 [2024-07-15 13:44:01.715352] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:22.545 [2024-07-15 
13:44:01.716639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:22.545 [2024-07-15 13:44:01.716695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:22.545 [2024-07-15 13:44:01.716741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:22.545 [2024-07-15 13:44:01.716946] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf5e8a0 00:25:22.545 [2024-07-15 13:44:01.716958] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:22.545 [2024-07-15 13:44:01.717161] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd8e10 00:25:22.545 [2024-07-15 13:44:01.717314] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf5e8a0 00:25:22.545 [2024-07-15 13:44:01.717325] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf5e8a0 00:25:22.545 [2024-07-15 13:44:01.717422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.545 13:44:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.110 13:44:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.110 "name": "raid_bdev1", 00:25:23.110 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:23.110 "strip_size_kb": 0, 00:25:23.110 "state": "online", 00:25:23.110 "raid_level": "raid1", 00:25:23.110 "superblock": true, 00:25:23.110 "num_base_bdevs": 4, 00:25:23.110 "num_base_bdevs_discovered": 4, 00:25:23.110 "num_base_bdevs_operational": 4, 00:25:23.110 "base_bdevs_list": [ 00:25:23.110 { 00:25:23.110 "name": "BaseBdev1", 00:25:23.110 "uuid": "8af612bd-fdff-5c1b-b52f-9341c78311c8", 00:25:23.110 "is_configured": true, 00:25:23.110 "data_offset": 2048, 00:25:23.110 "data_size": 63488 00:25:23.110 }, 00:25:23.110 { 00:25:23.110 "name": "BaseBdev2", 00:25:23.110 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:23.110 "is_configured": true, 00:25:23.110 "data_offset": 2048, 00:25:23.110 "data_size": 63488 00:25:23.110 }, 00:25:23.110 { 00:25:23.110 "name": "BaseBdev3", 00:25:23.110 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:23.110 "is_configured": true, 00:25:23.110 "data_offset": 2048, 00:25:23.110 "data_size": 63488 00:25:23.110 }, 00:25:23.110 { 00:25:23.110 "name": "BaseBdev4", 00:25:23.110 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:23.110 "is_configured": true, 00:25:23.110 "data_offset": 2048, 00:25:23.110 "data_size": 63488 00:25:23.110 } 00:25:23.110 ] 00:25:23.110 }' 
00:25:23.110 13:44:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.110 13:44:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:23.367 13:44:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:23.367 13:44:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:23.625 [2024-07-15 13:44:03.003056] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:23.625 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:23.625 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.625 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:23.882 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:23.882 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:23.882 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:23.882 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:24.140 [2024-07-15 13:44:03.381820] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2d670 00:25:24.140 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:24.140 Zero copy mechanism will not be used. 00:25:24.140 Running I/O for 60 seconds... 
00:25:24.140 [2024-07-15 13:44:03.515010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:24.140 [2024-07-15 13:44:03.515243] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe2d670 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.140 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.706 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:24.706 "name": "raid_bdev1", 00:25:24.706 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:24.706 "strip_size_kb": 0, 00:25:24.706 "state": "online", 00:25:24.706 "raid_level": "raid1", 
00:25:24.706 "superblock": true, 00:25:24.706 "num_base_bdevs": 4, 00:25:24.706 "num_base_bdevs_discovered": 3, 00:25:24.706 "num_base_bdevs_operational": 3, 00:25:24.706 "base_bdevs_list": [ 00:25:24.706 { 00:25:24.706 "name": null, 00:25:24.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.706 "is_configured": false, 00:25:24.706 "data_offset": 2048, 00:25:24.706 "data_size": 63488 00:25:24.706 }, 00:25:24.706 { 00:25:24.706 "name": "BaseBdev2", 00:25:24.706 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:24.706 "is_configured": true, 00:25:24.706 "data_offset": 2048, 00:25:24.706 "data_size": 63488 00:25:24.706 }, 00:25:24.706 { 00:25:24.706 "name": "BaseBdev3", 00:25:24.706 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:24.706 "is_configured": true, 00:25:24.706 "data_offset": 2048, 00:25:24.706 "data_size": 63488 00:25:24.706 }, 00:25:24.706 { 00:25:24.706 "name": "BaseBdev4", 00:25:24.706 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:24.706 "is_configured": true, 00:25:24.706 "data_offset": 2048, 00:25:24.706 "data_size": 63488 00:25:24.706 } 00:25:24.706 ] 00:25:24.706 }' 00:25:24.706 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:24.706 13:44:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.272 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:25.272 [2024-07-15 13:44:04.559727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:25.272 13:44:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:25.272 [2024-07-15 13:44:04.624795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf60b40 00:25:25.272 [2024-07-15 13:44:04.627189] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:25:25.530 [2024-07-15 13:44:04.757445] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:25.530 [2024-07-15 13:44:04.757751] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:25.787 [2024-07-15 13:44:04.971560] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:25.787 [2024-07-15 13:44:04.971802] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:26.044 [2024-07-15 13:44:05.244338] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:26.044 [2024-07-15 13:44:05.467724] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:26.044 [2024-07-15 13:44:05.468375] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.302 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.560 [2024-07-15 13:44:05.803986] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.560 "name": "raid_bdev1", 00:25:26.560 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:26.560 "strip_size_kb": 0, 00:25:26.560 "state": "online", 00:25:26.560 "raid_level": "raid1", 00:25:26.560 "superblock": true, 00:25:26.560 "num_base_bdevs": 4, 00:25:26.560 "num_base_bdevs_discovered": 4, 00:25:26.560 "num_base_bdevs_operational": 4, 00:25:26.560 "process": { 00:25:26.560 "type": "rebuild", 00:25:26.560 "target": "spare", 00:25:26.560 "progress": { 00:25:26.560 "blocks": 14336, 00:25:26.560 "percent": 22 00:25:26.560 } 00:25:26.560 }, 00:25:26.560 "base_bdevs_list": [ 00:25:26.560 { 00:25:26.560 "name": "spare", 00:25:26.560 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:26.560 "is_configured": true, 00:25:26.560 "data_offset": 2048, 00:25:26.560 "data_size": 63488 00:25:26.560 }, 00:25:26.560 { 00:25:26.560 "name": "BaseBdev2", 00:25:26.560 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:26.560 "is_configured": true, 00:25:26.560 "data_offset": 2048, 00:25:26.560 "data_size": 63488 00:25:26.560 }, 00:25:26.560 { 00:25:26.560 "name": "BaseBdev3", 00:25:26.560 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:26.560 "is_configured": true, 00:25:26.560 "data_offset": 2048, 00:25:26.560 "data_size": 63488 00:25:26.560 }, 00:25:26.560 { 00:25:26.560 "name": "BaseBdev4", 00:25:26.560 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:26.560 "is_configured": true, 00:25:26.560 "data_offset": 2048, 00:25:26.560 "data_size": 63488 00:25:26.560 } 00:25:26.560 ] 00:25:26.560 }' 00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:26.560 13:44:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:26.817 [2024-07-15 13:44:06.006935] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:26.817 [2024-07-15 13:44:06.007202] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:26.817 [2024-07-15 13:44:06.165366] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.074 [2024-07-15 13:44:06.269354] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:27.074 [2024-07-15 13:44:06.282857] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.074 [2024-07-15 13:44:06.282892] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.074 [2024-07-15 13:44:06.282904] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:27.074 [2024-07-15 13:44:06.306009] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe2d670 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.074 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.331 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.331 "name": "raid_bdev1", 00:25:27.331 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:27.331 "strip_size_kb": 0, 00:25:27.331 "state": "online", 00:25:27.331 "raid_level": "raid1", 00:25:27.331 "superblock": true, 00:25:27.331 "num_base_bdevs": 4, 00:25:27.331 "num_base_bdevs_discovered": 3, 00:25:27.331 "num_base_bdevs_operational": 3, 00:25:27.331 "base_bdevs_list": [ 00:25:27.331 { 00:25:27.331 "name": null, 00:25:27.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.331 "is_configured": false, 00:25:27.331 "data_offset": 2048, 00:25:27.331 "data_size": 63488 00:25:27.331 }, 00:25:27.331 { 00:25:27.331 "name": "BaseBdev2", 00:25:27.331 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:27.331 "is_configured": true, 00:25:27.331 
"data_offset": 2048, 00:25:27.331 "data_size": 63488 00:25:27.331 }, 00:25:27.331 { 00:25:27.331 "name": "BaseBdev3", 00:25:27.331 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:27.331 "is_configured": true, 00:25:27.331 "data_offset": 2048, 00:25:27.331 "data_size": 63488 00:25:27.331 }, 00:25:27.331 { 00:25:27.331 "name": "BaseBdev4", 00:25:27.331 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:27.331 "is_configured": true, 00:25:27.331 "data_offset": 2048, 00:25:27.331 "data_size": 63488 00:25:27.331 } 00:25:27.331 ] 00:25:27.331 }' 00:25:27.331 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.331 13:44:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.937 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.211 "name": "raid_bdev1", 00:25:28.211 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:28.211 "strip_size_kb": 0, 00:25:28.211 "state": "online", 00:25:28.211 "raid_level": "raid1", 00:25:28.211 "superblock": true, 
00:25:28.211 "num_base_bdevs": 4, 00:25:28.211 "num_base_bdevs_discovered": 3, 00:25:28.211 "num_base_bdevs_operational": 3, 00:25:28.211 "base_bdevs_list": [ 00:25:28.211 { 00:25:28.211 "name": null, 00:25:28.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.211 "is_configured": false, 00:25:28.211 "data_offset": 2048, 00:25:28.211 "data_size": 63488 00:25:28.211 }, 00:25:28.211 { 00:25:28.211 "name": "BaseBdev2", 00:25:28.211 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:28.211 "is_configured": true, 00:25:28.211 "data_offset": 2048, 00:25:28.211 "data_size": 63488 00:25:28.211 }, 00:25:28.211 { 00:25:28.211 "name": "BaseBdev3", 00:25:28.211 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:28.211 "is_configured": true, 00:25:28.211 "data_offset": 2048, 00:25:28.211 "data_size": 63488 00:25:28.211 }, 00:25:28.211 { 00:25:28.211 "name": "BaseBdev4", 00:25:28.211 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:28.211 "is_configured": true, 00:25:28.211 "data_offset": 2048, 00:25:28.211 "data_size": 63488 00:25:28.211 } 00:25:28.211 ] 00:25:28.211 }' 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:28.211 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:28.468 [2024-07-15 13:44:07.798081] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:28.468 13:44:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:28.468 
[2024-07-15 13:44:07.873593] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf628f0 00:25:28.468 [2024-07-15 13:44:07.875139] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:28.725 [2024-07-15 13:44:08.015131] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:28.725 [2024-07-15 13:44:08.015693] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:28.982 [2024-07-15 13:44:08.241310] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:29.239 [2024-07-15 13:44:08.525280] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.496 13:44:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.754 [2024-07-15 13:44:08.955113] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:29.754 [2024-07-15 13:44:08.955295] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:29.754 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.754 "name": "raid_bdev1", 00:25:29.754 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:29.754 "strip_size_kb": 0, 00:25:29.754 "state": "online", 00:25:29.754 "raid_level": "raid1", 00:25:29.754 "superblock": true, 00:25:29.754 "num_base_bdevs": 4, 00:25:29.754 "num_base_bdevs_discovered": 4, 00:25:29.754 "num_base_bdevs_operational": 4, 00:25:29.754 "process": { 00:25:29.754 "type": "rebuild", 00:25:29.754 "target": "spare", 00:25:29.754 "progress": { 00:25:29.754 "blocks": 18432, 00:25:29.754 "percent": 29 00:25:29.754 } 00:25:29.754 }, 00:25:29.754 "base_bdevs_list": [ 00:25:29.754 { 00:25:29.754 "name": "spare", 00:25:29.754 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:29.754 "is_configured": true, 00:25:29.754 "data_offset": 2048, 00:25:29.754 "data_size": 63488 00:25:29.754 }, 00:25:29.754 { 00:25:29.754 "name": "BaseBdev2", 00:25:29.754 "uuid": "36e9f256-04bd-500e-89a9-24214e023b8d", 00:25:29.754 "is_configured": true, 00:25:29.754 "data_offset": 2048, 00:25:29.754 "data_size": 63488 00:25:29.754 }, 00:25:29.754 { 00:25:29.754 "name": "BaseBdev3", 00:25:29.754 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:29.754 "is_configured": true, 00:25:29.754 "data_offset": 2048, 00:25:29.754 "data_size": 63488 00:25:29.754 }, 00:25:29.754 { 00:25:29.754 "name": "BaseBdev4", 00:25:29.754 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:29.754 "is_configured": true, 00:25:29.754 "data_offset": 2048, 00:25:29.754 "data_size": 63488 00:25:29.754 } 00:25:29.754 ] 00:25:29.754 }' 00:25:29.754 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.754 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:25:29.754 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.754 [2024-07-15 13:44:09.169344] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:30.012 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:30.012 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:30.012 [2024-07-15 13:44:09.279114] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:30.012 [2024-07-15 13:44:09.279712] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:30.012 [2024-07-15 13:44:09.427953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:30.577 [2024-07-15 13:44:09.696377] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xe2d670 00:25:30.577 [2024-07-15 13:44:09.696412] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xf628f0 00:25:30.577 13:44:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.577 13:44:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.577 [2024-07-15 13:44:09.845417] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:30.834 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.834 "name": "raid_bdev1", 00:25:30.834 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:30.834 "strip_size_kb": 0, 00:25:30.834 "state": "online", 00:25:30.834 "raid_level": "raid1", 00:25:30.834 "superblock": true, 00:25:30.834 "num_base_bdevs": 4, 00:25:30.834 "num_base_bdevs_discovered": 3, 00:25:30.834 "num_base_bdevs_operational": 3, 00:25:30.834 "process": { 00:25:30.834 "type": "rebuild", 00:25:30.834 "target": "spare", 00:25:30.834 "progress": { 00:25:30.834 "blocks": 30720, 00:25:30.834 "percent": 48 00:25:30.834 } 00:25:30.834 }, 00:25:30.834 "base_bdevs_list": [ 00:25:30.834 { 00:25:30.834 
"name": "spare", 00:25:30.834 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:30.834 "is_configured": true, 00:25:30.834 "data_offset": 2048, 00:25:30.834 "data_size": 63488 00:25:30.834 }, 00:25:30.834 { 00:25:30.834 "name": null, 00:25:30.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.834 "is_configured": false, 00:25:30.834 "data_offset": 2048, 00:25:30.834 "data_size": 63488 00:25:30.834 }, 00:25:30.834 { 00:25:30.834 "name": "BaseBdev3", 00:25:30.834 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:30.834 "is_configured": true, 00:25:30.834 "data_offset": 2048, 00:25:30.834 "data_size": 63488 00:25:30.834 }, 00:25:30.834 { 00:25:30.834 "name": "BaseBdev4", 00:25:30.834 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:30.834 "is_configured": true, 00:25:30.834 "data_offset": 2048, 00:25:30.834 "data_size": 63488 00:25:30.834 } 00:25:30.834 ] 00:25:30.834 }' 00:25:30.834 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.091 [2024-07-15 13:44:10.298808] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:31.091 [2024-07-15 13:44:10.299672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=954 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.091 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.091 [2024-07-15 13:44:10.503710] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.347 "name": "raid_bdev1", 00:25:31.347 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:31.347 "strip_size_kb": 0, 00:25:31.347 "state": "online", 00:25:31.347 "raid_level": "raid1", 00:25:31.347 "superblock": true, 00:25:31.347 "num_base_bdevs": 4, 00:25:31.347 "num_base_bdevs_discovered": 3, 00:25:31.347 "num_base_bdevs_operational": 3, 00:25:31.347 "process": { 00:25:31.347 "type": "rebuild", 00:25:31.347 "target": "spare", 00:25:31.347 "progress": { 00:25:31.347 "blocks": 34816, 00:25:31.347 "percent": 54 00:25:31.347 } 00:25:31.347 }, 00:25:31.347 "base_bdevs_list": [ 00:25:31.347 { 00:25:31.347 "name": "spare", 00:25:31.347 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:31.347 "is_configured": true, 00:25:31.347 "data_offset": 2048, 00:25:31.347 "data_size": 63488 00:25:31.347 }, 00:25:31.347 { 00:25:31.347 "name": null, 00:25:31.347 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:31.347 "is_configured": false, 00:25:31.347 "data_offset": 2048, 00:25:31.347 "data_size": 63488 00:25:31.347 }, 00:25:31.347 { 00:25:31.347 "name": "BaseBdev3", 00:25:31.347 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:31.347 "is_configured": true, 00:25:31.347 "data_offset": 2048, 00:25:31.347 "data_size": 63488 00:25:31.347 }, 00:25:31.347 { 00:25:31.347 "name": "BaseBdev4", 00:25:31.347 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:31.347 "is_configured": true, 00:25:31.347 "data_offset": 2048, 00:25:31.347 "data_size": 63488 00:25:31.347 } 00:25:31.347 ] 00:25:31.347 }' 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.347 13:44:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:32.279 [2024-07-15 13:44:11.492573] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:32.279 [2024-07-15 13:44:11.493132] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.279 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.536 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.536 "name": "raid_bdev1", 00:25:32.536 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:32.536 "strip_size_kb": 0, 00:25:32.536 "state": "online", 00:25:32.537 "raid_level": "raid1", 00:25:32.537 "superblock": true, 00:25:32.537 "num_base_bdevs": 4, 00:25:32.537 "num_base_bdevs_discovered": 3, 00:25:32.537 "num_base_bdevs_operational": 3, 00:25:32.537 "process": { 00:25:32.537 "type": "rebuild", 00:25:32.537 "target": "spare", 00:25:32.537 "progress": { 00:25:32.537 "blocks": 55296, 00:25:32.537 "percent": 87 00:25:32.537 } 00:25:32.537 }, 00:25:32.537 "base_bdevs_list": [ 00:25:32.537 { 00:25:32.537 "name": "spare", 00:25:32.537 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:32.537 "is_configured": true, 00:25:32.537 "data_offset": 2048, 00:25:32.537 "data_size": 63488 00:25:32.537 }, 00:25:32.537 { 00:25:32.537 "name": null, 00:25:32.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.537 "is_configured": false, 00:25:32.537 "data_offset": 2048, 00:25:32.537 "data_size": 63488 00:25:32.537 }, 00:25:32.537 { 00:25:32.537 "name": "BaseBdev3", 00:25:32.537 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:32.537 "is_configured": true, 00:25:32.537 "data_offset": 2048, 00:25:32.537 "data_size": 63488 00:25:32.537 }, 00:25:32.537 { 00:25:32.537 "name": "BaseBdev4", 00:25:32.537 
"uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:32.537 "is_configured": true, 00:25:32.537 "data_offset": 2048, 00:25:32.537 "data_size": 63488 00:25:32.537 } 00:25:32.537 ] 00:25:32.537 }' 00:25:32.537 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.794 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:32.794 13:44:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.794 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:32.794 13:44:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:32.794 [2024-07-15 13:44:12.046170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:25:33.051 [2024-07-15 13:44:12.286097] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:33.051 [2024-07-15 13:44:12.394374] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:33.051 [2024-07-15 13:44:12.397856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.616 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.874 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.874 "name": "raid_bdev1", 00:25:33.874 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:33.874 "strip_size_kb": 0, 00:25:33.874 "state": "online", 00:25:33.874 "raid_level": "raid1", 00:25:33.874 "superblock": true, 00:25:33.874 "num_base_bdevs": 4, 00:25:33.874 "num_base_bdevs_discovered": 3, 00:25:33.874 "num_base_bdevs_operational": 3, 00:25:33.874 "base_bdevs_list": [ 00:25:33.874 { 00:25:33.874 "name": "spare", 00:25:33.874 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:33.874 "is_configured": true, 00:25:33.874 "data_offset": 2048, 00:25:33.874 "data_size": 63488 00:25:33.874 }, 00:25:33.874 { 00:25:33.874 "name": null, 00:25:33.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.874 "is_configured": false, 00:25:33.874 "data_offset": 2048, 00:25:33.874 "data_size": 63488 00:25:33.874 }, 00:25:33.874 { 00:25:33.874 "name": "BaseBdev3", 00:25:33.874 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:33.874 "is_configured": true, 00:25:33.874 "data_offset": 2048, 00:25:33.874 "data_size": 63488 00:25:33.874 }, 00:25:33.874 { 00:25:33.874 "name": "BaseBdev4", 00:25:33.874 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:33.874 "is_configured": true, 00:25:33.874 "data_offset": 2048, 00:25:33.874 "data_size": 63488 00:25:33.874 } 00:25:33.874 ] 00:25:33.874 }' 00:25:33.874 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 
00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.131 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.387 "name": "raid_bdev1", 00:25:34.387 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:34.387 "strip_size_kb": 0, 00:25:34.387 "state": "online", 00:25:34.387 "raid_level": "raid1", 00:25:34.387 "superblock": true, 00:25:34.387 "num_base_bdevs": 4, 00:25:34.387 "num_base_bdevs_discovered": 3, 00:25:34.387 "num_base_bdevs_operational": 3, 00:25:34.387 "base_bdevs_list": [ 00:25:34.387 { 00:25:34.387 "name": "spare", 00:25:34.387 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:34.387 "is_configured": true, 00:25:34.387 "data_offset": 2048, 00:25:34.387 "data_size": 63488 00:25:34.387 }, 00:25:34.387 { 00:25:34.387 "name": null, 00:25:34.387 
"uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.387 "is_configured": false, 00:25:34.387 "data_offset": 2048, 00:25:34.387 "data_size": 63488 00:25:34.387 }, 00:25:34.387 { 00:25:34.387 "name": "BaseBdev3", 00:25:34.387 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:34.387 "is_configured": true, 00:25:34.387 "data_offset": 2048, 00:25:34.387 "data_size": 63488 00:25:34.387 }, 00:25:34.387 { 00:25:34.387 "name": "BaseBdev4", 00:25:34.387 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:34.387 "is_configured": true, 00:25:34.387 "data_offset": 2048, 00:25:34.387 "data_size": 63488 00:25:34.387 } 00:25:34.387 ] 00:25:34.387 }' 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.387 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.645 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.645 "name": "raid_bdev1", 00:25:34.645 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:34.645 "strip_size_kb": 0, 00:25:34.645 "state": "online", 00:25:34.645 "raid_level": "raid1", 00:25:34.645 "superblock": true, 00:25:34.645 "num_base_bdevs": 4, 00:25:34.645 "num_base_bdevs_discovered": 3, 00:25:34.645 "num_base_bdevs_operational": 3, 00:25:34.645 "base_bdevs_list": [ 00:25:34.645 { 00:25:34.645 "name": "spare", 00:25:34.645 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:34.645 "is_configured": true, 00:25:34.645 "data_offset": 2048, 00:25:34.645 "data_size": 63488 00:25:34.645 }, 00:25:34.645 { 00:25:34.645 "name": null, 00:25:34.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.645 "is_configured": false, 00:25:34.645 "data_offset": 2048, 00:25:34.645 "data_size": 63488 00:25:34.645 }, 00:25:34.645 { 00:25:34.645 "name": "BaseBdev3", 00:25:34.645 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:34.645 "is_configured": true, 00:25:34.645 "data_offset": 2048, 00:25:34.645 "data_size": 63488 00:25:34.645 }, 00:25:34.645 { 00:25:34.645 "name": "BaseBdev4", 00:25:34.645 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:34.645 "is_configured": true, 00:25:34.645 "data_offset": 2048, 00:25:34.645 "data_size": 63488 00:25:34.645 } 00:25:34.645 
] 00:25:34.645 }' 00:25:34.645 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.645 13:44:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.209 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:35.466 [2024-07-15 13:44:14.757233] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:35.466 [2024-07-15 13:44:14.757268] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:35.466 00:25:35.466 Latency(us) 00:25:35.466 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:35.466 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:35.466 raid_bdev1 : 11.42 99.41 298.23 0.00 0.00 13956.51 297.41 121270.09 00:25:35.466 =================================================================================================================== 00:25:35.466 Total : 99.41 298.23 0.00 0.00 13956.51 297.41 121270.09 00:25:35.466 [2024-07-15 13:44:14.833375] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.466 [2024-07-15 13:44:14.833406] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:35.466 [2024-07-15 13:44:14.833499] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:35.466 [2024-07-15 13:44:14.833512] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf5e8a0 name raid_bdev1, state offline 00:25:35.466 0 00:25:35.466 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.466 13:44:14 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@719 -- # jq length 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:35.723 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:35.980 /dev/nbd0 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:35.980 13:44:15 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:35.980 1+0 records in 00:25:35.980 1+0 records out 00:25:35.980 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000420245 s, 9.7 MB/s 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:35.980 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:36.237 /dev/nbd1 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # 
local i 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.237 1+0 records in 00:25:36.237 1+0 records out 00:25:36.237 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292592 s, 14.0 MB/s 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:36.237 13:44:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:36.237 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:36.494 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:36.495 13:44:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:36.752 /dev/nbd1 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 
00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:36.752 1+0 records in 00:25:36.752 1+0 records out 00:25:36.752 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253185 s, 16.2 MB/s 00:25:36.752 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.009 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:37.266 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:37.523 13:44:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:37.780 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:38.038 [2024-07-15 13:44:17.272565] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:38.038 [2024-07-15 13:44:17.272608] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.038 [2024-07-15 13:44:17.272628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf60380 00:25:38.038 [2024-07-15 13:44:17.272641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.038 [2024-07-15 13:44:17.274254] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.038 [2024-07-15 13:44:17.274281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:38.038 [2024-07-15 13:44:17.274355] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:38.038 [2024-07-15 13:44:17.274382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:38.038 [2024-07-15 13:44:17.274482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:38.038 [2024-07-15 13:44:17.274557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:38.038 spare 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.038 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.038 [2024-07-15 13:44:17.374870] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf63bd0 00:25:38.038 [2024-07-15 13:44:17.374886] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:38.038 [2024-07-15 13:44:17.375094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf650e0 00:25:38.038 [2024-07-15 13:44:17.375248] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf63bd0 00:25:38.038 [2024-07-15 13:44:17.375258] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf63bd0 00:25:38.038 [2024-07-15 13:44:17.375366] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.295 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.295 "name": "raid_bdev1", 00:25:38.295 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:38.295 "strip_size_kb": 0, 00:25:38.295 "state": "online", 00:25:38.295 "raid_level": "raid1", 00:25:38.295 "superblock": true, 00:25:38.295 "num_base_bdevs": 4, 00:25:38.295 "num_base_bdevs_discovered": 3, 00:25:38.295 "num_base_bdevs_operational": 3, 00:25:38.295 "base_bdevs_list": [ 00:25:38.295 { 00:25:38.295 "name": "spare", 00:25:38.295 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:38.295 "is_configured": true, 00:25:38.295 "data_offset": 2048, 00:25:38.295 "data_size": 63488 00:25:38.295 }, 00:25:38.295 { 00:25:38.295 "name": null, 00:25:38.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.295 "is_configured": false, 00:25:38.295 "data_offset": 2048, 00:25:38.295 "data_size": 63488 00:25:38.295 }, 00:25:38.295 { 00:25:38.295 "name": "BaseBdev3", 00:25:38.295 
"uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:38.295 "is_configured": true, 00:25:38.295 "data_offset": 2048, 00:25:38.295 "data_size": 63488 00:25:38.295 }, 00:25:38.295 { 00:25:38.295 "name": "BaseBdev4", 00:25:38.295 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:38.295 "is_configured": true, 00:25:38.295 "data_offset": 2048, 00:25:38.295 "data_size": 63488 00:25:38.295 } 00:25:38.295 ] 00:25:38.295 }' 00:25:38.295 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.295 13:44:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.859 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.116 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:39.116 "name": "raid_bdev1", 00:25:39.116 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:39.116 "strip_size_kb": 0, 00:25:39.116 "state": "online", 00:25:39.116 "raid_level": "raid1", 00:25:39.116 "superblock": true, 00:25:39.116 "num_base_bdevs": 4, 00:25:39.116 "num_base_bdevs_discovered": 3, 00:25:39.116 "num_base_bdevs_operational": 3, 00:25:39.116 
"base_bdevs_list": [ 00:25:39.116 { 00:25:39.116 "name": "spare", 00:25:39.116 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:39.116 "is_configured": true, 00:25:39.116 "data_offset": 2048, 00:25:39.116 "data_size": 63488 00:25:39.116 }, 00:25:39.116 { 00:25:39.116 "name": null, 00:25:39.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.116 "is_configured": false, 00:25:39.116 "data_offset": 2048, 00:25:39.116 "data_size": 63488 00:25:39.116 }, 00:25:39.116 { 00:25:39.116 "name": "BaseBdev3", 00:25:39.116 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:39.116 "is_configured": true, 00:25:39.116 "data_offset": 2048, 00:25:39.116 "data_size": 63488 00:25:39.116 }, 00:25:39.116 { 00:25:39.116 "name": "BaseBdev4", 00:25:39.116 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:39.117 "is_configured": true, 00:25:39.117 "data_offset": 2048, 00:25:39.117 "data_size": 63488 00:25:39.117 } 00:25:39.117 ] 00:25:39.117 }' 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.117 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:39.373 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:39.374 13:44:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:39.938 [2024-07-15 13:44:19.178225] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.938 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.195 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:40.195 "name": "raid_bdev1", 00:25:40.195 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:40.195 "strip_size_kb": 0, 00:25:40.195 "state": "online", 00:25:40.195 "raid_level": "raid1", 00:25:40.195 "superblock": true, 00:25:40.195 "num_base_bdevs": 4, 
00:25:40.195 "num_base_bdevs_discovered": 2, 00:25:40.195 "num_base_bdevs_operational": 2, 00:25:40.195 "base_bdevs_list": [ 00:25:40.195 { 00:25:40.195 "name": null, 00:25:40.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.195 "is_configured": false, 00:25:40.195 "data_offset": 2048, 00:25:40.195 "data_size": 63488 00:25:40.195 }, 00:25:40.195 { 00:25:40.195 "name": null, 00:25:40.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.195 "is_configured": false, 00:25:40.195 "data_offset": 2048, 00:25:40.195 "data_size": 63488 00:25:40.195 }, 00:25:40.195 { 00:25:40.195 "name": "BaseBdev3", 00:25:40.195 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:40.195 "is_configured": true, 00:25:40.195 "data_offset": 2048, 00:25:40.195 "data_size": 63488 00:25:40.195 }, 00:25:40.195 { 00:25:40.195 "name": "BaseBdev4", 00:25:40.195 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:40.195 "is_configured": true, 00:25:40.195 "data_offset": 2048, 00:25:40.195 "data_size": 63488 00:25:40.195 } 00:25:40.195 ] 00:25:40.195 }' 00:25:40.195 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:40.195 13:44:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:40.761 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:41.018 [2024-07-15 13:44:20.285318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:41.018 [2024-07-15 13:44:20.285470] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:41.018 [2024-07-15 13:44:20.285486] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:41.018 [2024-07-15 13:44:20.285514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:41.018 [2024-07-15 13:44:20.289934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb34fa0 00:25:41.018 [2024-07-15 13:44:20.292290] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:41.018 13:44:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.949 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.207 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.207 "name": "raid_bdev1", 00:25:42.207 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:42.207 "strip_size_kb": 0, 00:25:42.207 "state": "online", 00:25:42.207 "raid_level": "raid1", 00:25:42.207 "superblock": true, 00:25:42.207 "num_base_bdevs": 4, 00:25:42.207 "num_base_bdevs_discovered": 3, 00:25:42.207 "num_base_bdevs_operational": 3, 00:25:42.207 "process": { 00:25:42.207 "type": "rebuild", 00:25:42.207 "target": "spare", 00:25:42.207 "progress": { 00:25:42.207 "blocks": 24576, 
00:25:42.207 "percent": 38 00:25:42.207 } 00:25:42.207 }, 00:25:42.207 "base_bdevs_list": [ 00:25:42.207 { 00:25:42.207 "name": "spare", 00:25:42.207 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:42.207 "is_configured": true, 00:25:42.207 "data_offset": 2048, 00:25:42.207 "data_size": 63488 00:25:42.207 }, 00:25:42.207 { 00:25:42.207 "name": null, 00:25:42.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.207 "is_configured": false, 00:25:42.207 "data_offset": 2048, 00:25:42.207 "data_size": 63488 00:25:42.207 }, 00:25:42.207 { 00:25:42.207 "name": "BaseBdev3", 00:25:42.207 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:42.207 "is_configured": true, 00:25:42.207 "data_offset": 2048, 00:25:42.207 "data_size": 63488 00:25:42.207 }, 00:25:42.207 { 00:25:42.207 "name": "BaseBdev4", 00:25:42.207 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:42.207 "is_configured": true, 00:25:42.207 "data_offset": 2048, 00:25:42.207 "data_size": 63488 00:25:42.207 } 00:25:42.207 ] 00:25:42.207 }' 00:25:42.207 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.207 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.207 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.473 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.473 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:42.473 [2024-07-15 13:44:21.879414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.740 [2024-07-15 13:44:21.904823] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:42.740 [2024-07-15 13:44:21.904869] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.740 [2024-07-15 13:44:21.904887] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.740 [2024-07-15 13:44:21.904895] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.740 13:44:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.305 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.305 "name": "raid_bdev1", 00:25:43.305 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 
00:25:43.305 "strip_size_kb": 0, 00:25:43.305 "state": "online", 00:25:43.305 "raid_level": "raid1", 00:25:43.305 "superblock": true, 00:25:43.305 "num_base_bdevs": 4, 00:25:43.305 "num_base_bdevs_discovered": 2, 00:25:43.305 "num_base_bdevs_operational": 2, 00:25:43.305 "base_bdevs_list": [ 00:25:43.305 { 00:25:43.305 "name": null, 00:25:43.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.305 "is_configured": false, 00:25:43.305 "data_offset": 2048, 00:25:43.305 "data_size": 63488 00:25:43.305 }, 00:25:43.305 { 00:25:43.305 "name": null, 00:25:43.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.305 "is_configured": false, 00:25:43.305 "data_offset": 2048, 00:25:43.305 "data_size": 63488 00:25:43.305 }, 00:25:43.305 { 00:25:43.305 "name": "BaseBdev3", 00:25:43.305 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:43.305 "is_configured": true, 00:25:43.305 "data_offset": 2048, 00:25:43.305 "data_size": 63488 00:25:43.305 }, 00:25:43.305 { 00:25:43.305 "name": "BaseBdev4", 00:25:43.305 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:43.305 "is_configured": true, 00:25:43.305 "data_offset": 2048, 00:25:43.305 "data_size": 63488 00:25:43.305 } 00:25:43.305 ] 00:25:43.305 }' 00:25:43.305 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.305 13:44:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:43.869 13:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:43.869 [2024-07-15 13:44:23.196719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:43.869 [2024-07-15 13:44:23.196770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.869 [2024-07-15 13:44:23.196793] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0xf61730 00:25:43.869 [2024-07-15 13:44:23.196805] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.869 [2024-07-15 13:44:23.197184] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.869 [2024-07-15 13:44:23.197204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:43.869 [2024-07-15 13:44:23.197287] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:43.869 [2024-07-15 13:44:23.197299] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:43.869 [2024-07-15 13:44:23.197309] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:43.869 [2024-07-15 13:44:23.197328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:43.869 [2024-07-15 13:44:23.201814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf64c50 00:25:43.869 spare 00:25:43.869 [2024-07-15 13:44:23.203218] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:43.869 13:44:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.236 "name": "raid_bdev1", 00:25:45.236 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:45.236 "strip_size_kb": 0, 00:25:45.236 "state": "online", 00:25:45.236 "raid_level": "raid1", 00:25:45.236 "superblock": true, 00:25:45.236 "num_base_bdevs": 4, 00:25:45.236 "num_base_bdevs_discovered": 3, 00:25:45.236 "num_base_bdevs_operational": 3, 00:25:45.236 "process": { 00:25:45.236 "type": "rebuild", 00:25:45.236 "target": "spare", 00:25:45.236 "progress": { 00:25:45.236 "blocks": 24576, 00:25:45.236 "percent": 38 00:25:45.236 } 00:25:45.236 }, 00:25:45.236 "base_bdevs_list": [ 00:25:45.236 { 00:25:45.236 "name": "spare", 00:25:45.236 "uuid": "5bf0af31-6ccd-55eb-8b33-8dc34f4921b9", 00:25:45.236 "is_configured": true, 00:25:45.236 "data_offset": 2048, 00:25:45.236 "data_size": 63488 00:25:45.236 }, 00:25:45.236 { 00:25:45.236 "name": null, 00:25:45.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.236 "is_configured": false, 00:25:45.236 "data_offset": 2048, 00:25:45.236 "data_size": 63488 00:25:45.236 }, 00:25:45.236 { 00:25:45.236 "name": "BaseBdev3", 00:25:45.236 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:45.236 "is_configured": true, 00:25:45.236 "data_offset": 2048, 00:25:45.236 "data_size": 63488 00:25:45.236 }, 00:25:45.236 { 00:25:45.236 "name": "BaseBdev4", 00:25:45.236 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:45.236 "is_configured": true, 00:25:45.236 "data_offset": 2048, 00:25:45.236 "data_size": 63488 00:25:45.236 } 00:25:45.236 ] 00:25:45.236 }' 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.236 
13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:45.236 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:45.493 [2024-07-15 13:44:24.787634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:45.493 [2024-07-15 13:44:24.815644] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:45.493 [2024-07-15 13:44:24.815687] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:45.493 [2024-07-15 13:44:24.815704] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:45.493 [2024-07-15 13:44:24.815712] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:45.493 
13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.493 13:44:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.750 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.750 "name": "raid_bdev1", 00:25:45.750 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:45.750 "strip_size_kb": 0, 00:25:45.750 "state": "online", 00:25:45.750 "raid_level": "raid1", 00:25:45.750 "superblock": true, 00:25:45.750 "num_base_bdevs": 4, 00:25:45.750 "num_base_bdevs_discovered": 2, 00:25:45.750 "num_base_bdevs_operational": 2, 00:25:45.750 "base_bdevs_list": [ 00:25:45.750 { 00:25:45.750 "name": null, 00:25:45.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.750 "is_configured": false, 00:25:45.750 "data_offset": 2048, 00:25:45.750 "data_size": 63488 00:25:45.750 }, 00:25:45.750 { 00:25:45.750 "name": null, 00:25:45.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.750 "is_configured": false, 00:25:45.750 "data_offset": 2048, 00:25:45.750 "data_size": 63488 00:25:45.750 }, 00:25:45.750 { 00:25:45.750 "name": "BaseBdev3", 00:25:45.750 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:45.750 "is_configured": true, 00:25:45.750 "data_offset": 2048, 00:25:45.750 "data_size": 63488 00:25:45.750 }, 00:25:45.750 { 00:25:45.750 "name": "BaseBdev4", 00:25:45.750 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:45.750 "is_configured": true, 00:25:45.750 "data_offset": 2048, 00:25:45.750 
"data_size": 63488 00:25:45.750 } 00:25:45.750 ] 00:25:45.750 }' 00:25:45.750 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.750 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.313 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.570 "name": "raid_bdev1", 00:25:46.570 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:46.570 "strip_size_kb": 0, 00:25:46.570 "state": "online", 00:25:46.570 "raid_level": "raid1", 00:25:46.570 "superblock": true, 00:25:46.570 "num_base_bdevs": 4, 00:25:46.570 "num_base_bdevs_discovered": 2, 00:25:46.570 "num_base_bdevs_operational": 2, 00:25:46.570 "base_bdevs_list": [ 00:25:46.570 { 00:25:46.570 "name": null, 00:25:46.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.570 "is_configured": false, 00:25:46.570 "data_offset": 2048, 00:25:46.570 "data_size": 63488 00:25:46.570 }, 00:25:46.570 { 00:25:46.570 "name": null, 00:25:46.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.570 
"is_configured": false, 00:25:46.570 "data_offset": 2048, 00:25:46.570 "data_size": 63488 00:25:46.570 }, 00:25:46.570 { 00:25:46.570 "name": "BaseBdev3", 00:25:46.570 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:46.570 "is_configured": true, 00:25:46.570 "data_offset": 2048, 00:25:46.570 "data_size": 63488 00:25:46.570 }, 00:25:46.570 { 00:25:46.570 "name": "BaseBdev4", 00:25:46.570 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:46.570 "is_configured": true, 00:25:46.570 "data_offset": 2048, 00:25:46.570 "data_size": 63488 00:25:46.570 } 00:25:46.570 ] 00:25:46.570 }' 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:46.570 13:44:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:46.827 13:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:47.390 [2024-07-15 13:44:26.677109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:47.390 [2024-07-15 13:44:26.677159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:47.390 [2024-07-15 13:44:26.677181] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf64ee0 00:25:47.390 [2024-07-15 13:44:26.677194] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:47.390 [2024-07-15 
13:44:26.677537] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:47.390 [2024-07-15 13:44:26.677555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:47.390 [2024-07-15 13:44:26.677620] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:47.390 [2024-07-15 13:44:26.677632] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:47.390 [2024-07-15 13:44:26.677643] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:47.390 BaseBdev1 00:25:47.390 13:44:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.321 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.592 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.592 "name": "raid_bdev1", 00:25:48.592 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:48.592 "strip_size_kb": 0, 00:25:48.592 "state": "online", 00:25:48.592 "raid_level": "raid1", 00:25:48.592 "superblock": true, 00:25:48.592 "num_base_bdevs": 4, 00:25:48.592 "num_base_bdevs_discovered": 2, 00:25:48.592 "num_base_bdevs_operational": 2, 00:25:48.592 "base_bdevs_list": [ 00:25:48.592 { 00:25:48.592 "name": null, 00:25:48.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.592 "is_configured": false, 00:25:48.592 "data_offset": 2048, 00:25:48.592 "data_size": 63488 00:25:48.592 }, 00:25:48.592 { 00:25:48.592 "name": null, 00:25:48.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.592 "is_configured": false, 00:25:48.592 "data_offset": 2048, 00:25:48.592 "data_size": 63488 00:25:48.592 }, 00:25:48.592 { 00:25:48.592 "name": "BaseBdev3", 00:25:48.592 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:48.592 "is_configured": true, 00:25:48.592 "data_offset": 2048, 00:25:48.592 "data_size": 63488 00:25:48.592 }, 00:25:48.592 { 00:25:48.592 "name": "BaseBdev4", 00:25:48.592 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:48.592 "is_configured": true, 00:25:48.592 "data_offset": 2048, 00:25:48.592 "data_size": 63488 00:25:48.592 } 00:25:48.592 ] 00:25:48.592 }' 00:25:48.592 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.592 13:44:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.155 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.411 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:49.411 "name": "raid_bdev1", 00:25:49.411 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:49.412 "strip_size_kb": 0, 00:25:49.412 "state": "online", 00:25:49.412 "raid_level": "raid1", 00:25:49.412 "superblock": true, 00:25:49.412 "num_base_bdevs": 4, 00:25:49.412 "num_base_bdevs_discovered": 2, 00:25:49.412 "num_base_bdevs_operational": 2, 00:25:49.412 "base_bdevs_list": [ 00:25:49.412 { 00:25:49.412 "name": null, 00:25:49.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.412 "is_configured": false, 00:25:49.412 "data_offset": 2048, 00:25:49.412 "data_size": 63488 00:25:49.412 }, 00:25:49.412 { 00:25:49.412 "name": null, 00:25:49.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.412 "is_configured": false, 00:25:49.412 "data_offset": 2048, 00:25:49.412 "data_size": 63488 00:25:49.412 }, 00:25:49.412 { 00:25:49.412 "name": "BaseBdev3", 00:25:49.412 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:49.412 "is_configured": true, 00:25:49.412 "data_offset": 2048, 00:25:49.412 "data_size": 63488 00:25:49.412 }, 00:25:49.412 { 
00:25:49.412 "name": "BaseBdev4", 00:25:49.412 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:49.412 "is_configured": true, 00:25:49.412 "data_offset": 2048, 00:25:49.412 "data_size": 63488 00:25:49.412 } 00:25:49.412 ] 00:25:49.412 }' 00:25:49.412 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:49.667 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:49.667 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:49.668 13:44:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:49.924 [2024-07-15 13:44:29.136003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:49.924 [2024-07-15 13:44:29.136133] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:49.924 [2024-07-15 13:44:29.136149] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:49.924 request: 00:25:49.924 { 00:25:49.924 "base_bdev": "BaseBdev1", 00:25:49.924 "raid_bdev": "raid_bdev1", 00:25:49.924 "method": "bdev_raid_add_base_bdev", 00:25:49.924 "req_id": 1 00:25:49.924 } 00:25:49.924 Got JSON-RPC error response 00:25:49.924 response: 00:25:49.924 { 00:25:49.924 "code": -22, 00:25:49.924 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:49.924 } 00:25:49.924 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:25:49.924 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:49.924 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:49.924 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:49.924 13:44:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.854 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.111 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.111 "name": "raid_bdev1", 00:25:51.111 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:51.111 "strip_size_kb": 0, 00:25:51.111 "state": "online", 00:25:51.111 "raid_level": "raid1", 00:25:51.111 "superblock": true, 00:25:51.111 "num_base_bdevs": 4, 00:25:51.111 
"num_base_bdevs_discovered": 2, 00:25:51.111 "num_base_bdevs_operational": 2, 00:25:51.111 "base_bdevs_list": [ 00:25:51.111 { 00:25:51.111 "name": null, 00:25:51.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.111 "is_configured": false, 00:25:51.111 "data_offset": 2048, 00:25:51.111 "data_size": 63488 00:25:51.111 }, 00:25:51.111 { 00:25:51.111 "name": null, 00:25:51.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.111 "is_configured": false, 00:25:51.111 "data_offset": 2048, 00:25:51.111 "data_size": 63488 00:25:51.111 }, 00:25:51.111 { 00:25:51.111 "name": "BaseBdev3", 00:25:51.111 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:51.111 "is_configured": true, 00:25:51.111 "data_offset": 2048, 00:25:51.111 "data_size": 63488 00:25:51.111 }, 00:25:51.111 { 00:25:51.112 "name": "BaseBdev4", 00:25:51.112 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:51.112 "is_configured": true, 00:25:51.112 "data_offset": 2048, 00:25:51.112 "data_size": 63488 00:25:51.112 } 00:25:51.112 ] 00:25:51.112 }' 00:25:51.112 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.112 13:44:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:51.675 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.932 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.932 "name": "raid_bdev1", 00:25:51.932 "uuid": "c401d370-958b-42e4-9469-91520492d9e4", 00:25:51.932 "strip_size_kb": 0, 00:25:51.932 "state": "online", 00:25:51.932 "raid_level": "raid1", 00:25:51.932 "superblock": true, 00:25:51.932 "num_base_bdevs": 4, 00:25:51.932 "num_base_bdevs_discovered": 2, 00:25:51.932 "num_base_bdevs_operational": 2, 00:25:51.932 "base_bdevs_list": [ 00:25:51.932 { 00:25:51.932 "name": null, 00:25:51.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.932 "is_configured": false, 00:25:51.932 "data_offset": 2048, 00:25:51.932 "data_size": 63488 00:25:51.932 }, 00:25:51.932 { 00:25:51.932 "name": null, 00:25:51.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.932 "is_configured": false, 00:25:51.932 "data_offset": 2048, 00:25:51.932 "data_size": 63488 00:25:51.932 }, 00:25:51.932 { 00:25:51.932 "name": "BaseBdev3", 00:25:51.932 "uuid": "011edf8d-ad7c-55b1-b028-79cd5d839587", 00:25:51.932 "is_configured": true, 00:25:51.932 "data_offset": 2048, 00:25:51.932 "data_size": 63488 00:25:51.932 }, 00:25:51.932 { 00:25:51.932 "name": "BaseBdev4", 00:25:51.932 "uuid": "45e89881-ac14-587c-abd6-56c001ec5726", 00:25:51.932 "is_configured": true, 00:25:51.932 "data_offset": 2048, 00:25:51.932 "data_size": 63488 00:25:51.932 } 00:25:51.932 ] 00:25:51.932 }' 00:25:51.932 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.932 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:51.932 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2201037 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2201037 ']' 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2201037 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2201037 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2201037' 00:25:52.190 killing process with pid 2201037 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2201037 00:25:52.190 Received shutdown signal, test time was about 27.962774 seconds 00:25:52.190 00:25:52.190 Latency(us) 00:25:52.190 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.190 =================================================================================================================== 00:25:52.190 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:52.190 [2024-07-15 13:44:31.415264] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:52.190 [2024-07-15 13:44:31.415368] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:52.190 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2201037 00:25:52.190 [2024-07-15 13:44:31.415435] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:52.190 [2024-07-15 13:44:31.415449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf63bd0 name raid_bdev1, state offline 00:25:52.190 [2024-07-15 13:44:31.460897] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:52.448 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:52.448 00:25:52.448 real 0m34.772s 00:25:52.448 user 0m55.281s 00:25:52.448 sys 0m5.574s 00:25:52.448 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:52.448 13:44:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:52.448 ************************************ 00:25:52.448 END TEST raid_rebuild_test_sb_io 00:25:52.448 ************************************ 00:25:52.448 13:44:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:52.448 13:44:31 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:25:52.448 13:44:31 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:25:52.448 13:44:31 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:25:52.448 13:44:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:52.448 13:44:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:52.448 13:44:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:52.448 ************************************ 00:25:52.448 START TEST raid_state_function_test_sb_4k 00:25:52.448 ************************************ 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:52.448 13:44:31 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2206058 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2206058' 00:25:52.448 Process raid pid: 2206058 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2206058 /var/tmp/spdk-raid.sock 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2206058 ']' 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:52.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:52.448 13:44:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:52.449 [2024-07-15 13:44:31.839350] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:25:52.449 [2024-07-15 13:44:31.839415] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:52.706 [2024-07-15 13:44:31.966183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.706 [2024-07-15 13:44:32.070079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.706 [2024-07-15 13:44:32.130374] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:52.706 [2024-07-15 13:44:32.130398] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:53.638 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:53.638 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:25:53.638 13:44:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:53.638 [2024-07-15 13:44:33.004461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:53.638 [2024-07-15 13:44:33.004504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:53.638 [2024-07-15 13:44:33.004515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:53.638 [2024-07-15 13:44:33.004527] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.638 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:53.908 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.908 "name": "Existed_Raid", 00:25:53.908 "uuid": "be7a0f00-b267-4102-a238-0d5c7dfddaf1", 00:25:53.908 "strip_size_kb": 0, 00:25:53.908 "state": "configuring", 00:25:53.908 "raid_level": "raid1", 00:25:53.908 "superblock": true, 00:25:53.908 
"num_base_bdevs": 2, 00:25:53.908 "num_base_bdevs_discovered": 0, 00:25:53.908 "num_base_bdevs_operational": 2, 00:25:53.908 "base_bdevs_list": [ 00:25:53.908 { 00:25:53.908 "name": "BaseBdev1", 00:25:53.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.908 "is_configured": false, 00:25:53.908 "data_offset": 0, 00:25:53.908 "data_size": 0 00:25:53.908 }, 00:25:53.908 { 00:25:53.908 "name": "BaseBdev2", 00:25:53.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.908 "is_configured": false, 00:25:53.908 "data_offset": 0, 00:25:53.908 "data_size": 0 00:25:53.908 } 00:25:53.908 ] 00:25:53.908 }' 00:25:53.908 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.908 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:54.472 13:44:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:54.729 [2024-07-15 13:44:34.103240] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:54.729 [2024-07-15 13:44:34.103271] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa70a80 name Existed_Raid, state configuring 00:25:54.729 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:54.987 [2024-07-15 13:44:34.347898] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:54.987 [2024-07-15 13:44:34.347935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:54.987 [2024-07-15 13:44:34.347945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:54.987 [2024-07-15 
13:44:34.347957] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:54.987 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:25:55.244 [2024-07-15 13:44:34.602429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:55.244 BaseBdev1 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:55.244 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:55.501 13:44:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:55.759 [ 00:25:55.759 { 00:25:55.759 "name": "BaseBdev1", 00:25:55.759 "aliases": [ 00:25:55.759 "1806f06f-33f7-4094-8f8c-5f3753c1ba0d" 00:25:55.759 ], 00:25:55.759 "product_name": "Malloc disk", 00:25:55.759 "block_size": 4096, 00:25:55.759 "num_blocks": 8192, 00:25:55.759 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:25:55.759 "assigned_rate_limits": { 
00:25:55.759 "rw_ios_per_sec": 0, 00:25:55.759 "rw_mbytes_per_sec": 0, 00:25:55.759 "r_mbytes_per_sec": 0, 00:25:55.759 "w_mbytes_per_sec": 0 00:25:55.759 }, 00:25:55.759 "claimed": true, 00:25:55.759 "claim_type": "exclusive_write", 00:25:55.759 "zoned": false, 00:25:55.759 "supported_io_types": { 00:25:55.759 "read": true, 00:25:55.759 "write": true, 00:25:55.759 "unmap": true, 00:25:55.759 "flush": true, 00:25:55.759 "reset": true, 00:25:55.759 "nvme_admin": false, 00:25:55.759 "nvme_io": false, 00:25:55.759 "nvme_io_md": false, 00:25:55.759 "write_zeroes": true, 00:25:55.759 "zcopy": true, 00:25:55.759 "get_zone_info": false, 00:25:55.759 "zone_management": false, 00:25:55.759 "zone_append": false, 00:25:55.759 "compare": false, 00:25:55.759 "compare_and_write": false, 00:25:55.759 "abort": true, 00:25:55.759 "seek_hole": false, 00:25:55.759 "seek_data": false, 00:25:55.759 "copy": true, 00:25:55.759 "nvme_iov_md": false 00:25:55.759 }, 00:25:55.759 "memory_domains": [ 00:25:55.759 { 00:25:55.759 "dma_device_id": "system", 00:25:55.759 "dma_device_type": 1 00:25:55.759 }, 00:25:55.759 { 00:25:55.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:55.759 "dma_device_type": 2 00:25:55.759 } 00:25:55.759 ], 00:25:55.759 "driver_specific": {} 00:25:55.759 } 00:25:55.759 ] 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.759 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:56.017 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.017 "name": "Existed_Raid", 00:25:56.017 "uuid": "2631495c-7550-4064-a379-2a9c57f676e6", 00:25:56.017 "strip_size_kb": 0, 00:25:56.017 "state": "configuring", 00:25:56.017 "raid_level": "raid1", 00:25:56.017 "superblock": true, 00:25:56.017 "num_base_bdevs": 2, 00:25:56.017 "num_base_bdevs_discovered": 1, 00:25:56.017 "num_base_bdevs_operational": 2, 00:25:56.017 "base_bdevs_list": [ 00:25:56.017 { 00:25:56.017 "name": "BaseBdev1", 00:25:56.017 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:25:56.017 "is_configured": true, 00:25:56.017 "data_offset": 256, 00:25:56.017 "data_size": 7936 00:25:56.017 }, 00:25:56.017 { 00:25:56.017 "name": "BaseBdev2", 00:25:56.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.017 "is_configured": false, 00:25:56.017 "data_offset": 0, 00:25:56.017 "data_size": 0 00:25:56.017 } 00:25:56.017 ] 00:25:56.017 }' 00:25:56.017 13:44:35 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.017 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:56.580 13:44:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:56.837 [2024-07-15 13:44:36.142513] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:56.837 [2024-07-15 13:44:36.142553] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa70350 name Existed_Raid, state configuring 00:25:56.837 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:57.093 [2024-07-15 13:44:36.387192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:57.094 [2024-07-15 13:44:36.388679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:57.094 [2024-07-15 13:44:36.388712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:57.094 13:44:36 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.094 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:57.381 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.381 "name": "Existed_Raid", 00:25:57.381 "uuid": "104544cf-3968-4476-bae7-299ac46ac75d", 00:25:57.381 "strip_size_kb": 0, 00:25:57.381 "state": "configuring", 00:25:57.381 "raid_level": "raid1", 00:25:57.381 "superblock": true, 00:25:57.381 "num_base_bdevs": 2, 00:25:57.381 "num_base_bdevs_discovered": 1, 00:25:57.381 "num_base_bdevs_operational": 2, 00:25:57.381 "base_bdevs_list": [ 00:25:57.381 { 00:25:57.381 "name": "BaseBdev1", 00:25:57.381 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:25:57.381 "is_configured": true, 00:25:57.381 "data_offset": 256, 00:25:57.381 "data_size": 7936 00:25:57.381 }, 00:25:57.381 { 00:25:57.381 "name": "BaseBdev2", 00:25:57.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.381 
"is_configured": false, 00:25:57.381 "data_offset": 0, 00:25:57.381 "data_size": 0 00:25:57.381 } 00:25:57.381 ] 00:25:57.381 }' 00:25:57.381 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.381 13:44:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:57.952 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:25:58.210 [2024-07-15 13:44:37.513466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:58.210 [2024-07-15 13:44:37.513616] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa71000 00:25:58.210 [2024-07-15 13:44:37.513630] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:58.210 [2024-07-15 13:44:37.513801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98b0c0 00:25:58.210 [2024-07-15 13:44:37.513920] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa71000 00:25:58.210 [2024-07-15 13:44:37.513940] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa71000 00:25:58.210 [2024-07-15 13:44:37.514033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:58.210 BaseBdev2 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:25:58.210 13:44:37 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:58.210 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:58.468 13:44:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:58.725 [ 00:25:58.725 { 00:25:58.725 "name": "BaseBdev2", 00:25:58.725 "aliases": [ 00:25:58.725 "71f2639f-df1e-41bf-bda8-e193b8907a7c" 00:25:58.725 ], 00:25:58.725 "product_name": "Malloc disk", 00:25:58.725 "block_size": 4096, 00:25:58.725 "num_blocks": 8192, 00:25:58.725 "uuid": "71f2639f-df1e-41bf-bda8-e193b8907a7c", 00:25:58.725 "assigned_rate_limits": { 00:25:58.725 "rw_ios_per_sec": 0, 00:25:58.725 "rw_mbytes_per_sec": 0, 00:25:58.725 "r_mbytes_per_sec": 0, 00:25:58.725 "w_mbytes_per_sec": 0 00:25:58.725 }, 00:25:58.725 "claimed": true, 00:25:58.725 "claim_type": "exclusive_write", 00:25:58.725 "zoned": false, 00:25:58.725 "supported_io_types": { 00:25:58.725 "read": true, 00:25:58.725 "write": true, 00:25:58.725 "unmap": true, 00:25:58.725 "flush": true, 00:25:58.725 "reset": true, 00:25:58.725 "nvme_admin": false, 00:25:58.725 "nvme_io": false, 00:25:58.725 "nvme_io_md": false, 00:25:58.725 "write_zeroes": true, 00:25:58.725 "zcopy": true, 00:25:58.725 "get_zone_info": false, 00:25:58.725 "zone_management": false, 00:25:58.725 "zone_append": false, 00:25:58.725 "compare": false, 00:25:58.725 "compare_and_write": false, 00:25:58.725 "abort": true, 00:25:58.725 "seek_hole": false, 00:25:58.725 "seek_data": false, 00:25:58.725 "copy": true, 00:25:58.725 "nvme_iov_md": false 00:25:58.725 }, 00:25:58.725 
"memory_domains": [ 00:25:58.725 { 00:25:58.725 "dma_device_id": "system", 00:25:58.725 "dma_device_type": 1 00:25:58.725 }, 00:25:58.725 { 00:25:58.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:58.725 "dma_device_type": 2 00:25:58.725 } 00:25:58.725 ], 00:25:58.725 "driver_specific": {} 00:25:58.725 } 00:25:58.725 ] 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.725 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:58.982 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.982 "name": "Existed_Raid", 00:25:58.982 "uuid": "104544cf-3968-4476-bae7-299ac46ac75d", 00:25:58.982 "strip_size_kb": 0, 00:25:58.982 "state": "online", 00:25:58.982 "raid_level": "raid1", 00:25:58.982 "superblock": true, 00:25:58.982 "num_base_bdevs": 2, 00:25:58.982 "num_base_bdevs_discovered": 2, 00:25:58.982 "num_base_bdevs_operational": 2, 00:25:58.982 "base_bdevs_list": [ 00:25:58.982 { 00:25:58.982 "name": "BaseBdev1", 00:25:58.982 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:25:58.982 "is_configured": true, 00:25:58.982 "data_offset": 256, 00:25:58.982 "data_size": 7936 00:25:58.982 }, 00:25:58.982 { 00:25:58.982 "name": "BaseBdev2", 00:25:58.982 "uuid": "71f2639f-df1e-41bf-bda8-e193b8907a7c", 00:25:58.982 "is_configured": true, 00:25:58.982 "data_offset": 256, 00:25:58.982 "data_size": 7936 00:25:58.982 } 00:25:58.982 ] 00:25:58.982 }' 00:25:58.982 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.982 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:59.549 13:44:38 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:59.549 13:44:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:59.806 [2024-07-15 13:44:39.037791] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:59.806 "name": "Existed_Raid", 00:25:59.806 "aliases": [ 00:25:59.806 "104544cf-3968-4476-bae7-299ac46ac75d" 00:25:59.806 ], 00:25:59.806 "product_name": "Raid Volume", 00:25:59.806 "block_size": 4096, 00:25:59.806 "num_blocks": 7936, 00:25:59.806 "uuid": "104544cf-3968-4476-bae7-299ac46ac75d", 00:25:59.806 "assigned_rate_limits": { 00:25:59.806 "rw_ios_per_sec": 0, 00:25:59.806 "rw_mbytes_per_sec": 0, 00:25:59.806 "r_mbytes_per_sec": 0, 00:25:59.806 "w_mbytes_per_sec": 0 00:25:59.806 }, 00:25:59.806 "claimed": false, 00:25:59.806 "zoned": false, 00:25:59.806 "supported_io_types": { 00:25:59.806 "read": true, 00:25:59.806 "write": true, 00:25:59.806 "unmap": false, 00:25:59.806 "flush": false, 00:25:59.806 "reset": true, 00:25:59.806 "nvme_admin": false, 00:25:59.806 "nvme_io": false, 00:25:59.806 "nvme_io_md": false, 00:25:59.806 "write_zeroes": true, 00:25:59.806 "zcopy": false, 00:25:59.806 "get_zone_info": false, 00:25:59.806 "zone_management": false, 00:25:59.806 "zone_append": false, 00:25:59.806 "compare": false, 00:25:59.806 "compare_and_write": false, 00:25:59.806 "abort": false, 00:25:59.806 "seek_hole": false, 00:25:59.806 "seek_data": false, 00:25:59.806 "copy": false, 00:25:59.806 "nvme_iov_md": false 00:25:59.806 
}, 00:25:59.806 "memory_domains": [ 00:25:59.806 { 00:25:59.806 "dma_device_id": "system", 00:25:59.806 "dma_device_type": 1 00:25:59.806 }, 00:25:59.806 { 00:25:59.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:59.806 "dma_device_type": 2 00:25:59.806 }, 00:25:59.806 { 00:25:59.806 "dma_device_id": "system", 00:25:59.806 "dma_device_type": 1 00:25:59.806 }, 00:25:59.806 { 00:25:59.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:59.806 "dma_device_type": 2 00:25:59.806 } 00:25:59.806 ], 00:25:59.806 "driver_specific": { 00:25:59.806 "raid": { 00:25:59.806 "uuid": "104544cf-3968-4476-bae7-299ac46ac75d", 00:25:59.806 "strip_size_kb": 0, 00:25:59.806 "state": "online", 00:25:59.806 "raid_level": "raid1", 00:25:59.806 "superblock": true, 00:25:59.806 "num_base_bdevs": 2, 00:25:59.806 "num_base_bdevs_discovered": 2, 00:25:59.806 "num_base_bdevs_operational": 2, 00:25:59.806 "base_bdevs_list": [ 00:25:59.806 { 00:25:59.806 "name": "BaseBdev1", 00:25:59.806 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:25:59.806 "is_configured": true, 00:25:59.806 "data_offset": 256, 00:25:59.806 "data_size": 7936 00:25:59.806 }, 00:25:59.806 { 00:25:59.806 "name": "BaseBdev2", 00:25:59.806 "uuid": "71f2639f-df1e-41bf-bda8-e193b8907a7c", 00:25:59.806 "is_configured": true, 00:25:59.806 "data_offset": 256, 00:25:59.806 "data_size": 7936 00:25:59.806 } 00:25:59.806 ] 00:25:59.806 } 00:25:59.806 } 00:25:59.806 }' 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:59.806 BaseBdev2' 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:59.806 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:00.063 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:00.063 "name": "BaseBdev1", 00:26:00.063 "aliases": [ 00:26:00.063 "1806f06f-33f7-4094-8f8c-5f3753c1ba0d" 00:26:00.063 ], 00:26:00.063 "product_name": "Malloc disk", 00:26:00.063 "block_size": 4096, 00:26:00.063 "num_blocks": 8192, 00:26:00.063 "uuid": "1806f06f-33f7-4094-8f8c-5f3753c1ba0d", 00:26:00.063 "assigned_rate_limits": { 00:26:00.063 "rw_ios_per_sec": 0, 00:26:00.063 "rw_mbytes_per_sec": 0, 00:26:00.063 "r_mbytes_per_sec": 0, 00:26:00.063 "w_mbytes_per_sec": 0 00:26:00.063 }, 00:26:00.063 "claimed": true, 00:26:00.063 "claim_type": "exclusive_write", 00:26:00.063 "zoned": false, 00:26:00.063 "supported_io_types": { 00:26:00.063 "read": true, 00:26:00.063 "write": true, 00:26:00.063 "unmap": true, 00:26:00.063 "flush": true, 00:26:00.063 "reset": true, 00:26:00.063 "nvme_admin": false, 00:26:00.063 "nvme_io": false, 00:26:00.063 "nvme_io_md": false, 00:26:00.063 "write_zeroes": true, 00:26:00.063 "zcopy": true, 00:26:00.063 "get_zone_info": false, 00:26:00.063 "zone_management": false, 00:26:00.063 "zone_append": false, 00:26:00.063 "compare": false, 00:26:00.063 "compare_and_write": false, 00:26:00.063 "abort": true, 00:26:00.063 "seek_hole": false, 00:26:00.063 "seek_data": false, 00:26:00.063 "copy": true, 00:26:00.063 "nvme_iov_md": false 00:26:00.063 }, 00:26:00.063 "memory_domains": [ 00:26:00.063 { 00:26:00.063 "dma_device_id": "system", 00:26:00.063 "dma_device_type": 1 00:26:00.063 }, 00:26:00.063 { 00:26:00.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.063 "dma_device_type": 2 00:26:00.063 } 00:26:00.063 ], 00:26:00.063 "driver_specific": {} 00:26:00.063 }' 00:26:00.063 13:44:39 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.063 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.063 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:00.063 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:00.063 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:00.320 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:00.578 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:00.578 "name": "BaseBdev2", 00:26:00.578 "aliases": [ 00:26:00.578 "71f2639f-df1e-41bf-bda8-e193b8907a7c" 00:26:00.578 ], 00:26:00.578 "product_name": "Malloc 
disk", 00:26:00.578 "block_size": 4096, 00:26:00.578 "num_blocks": 8192, 00:26:00.578 "uuid": "71f2639f-df1e-41bf-bda8-e193b8907a7c", 00:26:00.578 "assigned_rate_limits": { 00:26:00.578 "rw_ios_per_sec": 0, 00:26:00.578 "rw_mbytes_per_sec": 0, 00:26:00.578 "r_mbytes_per_sec": 0, 00:26:00.578 "w_mbytes_per_sec": 0 00:26:00.578 }, 00:26:00.578 "claimed": true, 00:26:00.578 "claim_type": "exclusive_write", 00:26:00.578 "zoned": false, 00:26:00.578 "supported_io_types": { 00:26:00.578 "read": true, 00:26:00.578 "write": true, 00:26:00.578 "unmap": true, 00:26:00.578 "flush": true, 00:26:00.578 "reset": true, 00:26:00.578 "nvme_admin": false, 00:26:00.578 "nvme_io": false, 00:26:00.578 "nvme_io_md": false, 00:26:00.578 "write_zeroes": true, 00:26:00.578 "zcopy": true, 00:26:00.578 "get_zone_info": false, 00:26:00.578 "zone_management": false, 00:26:00.578 "zone_append": false, 00:26:00.578 "compare": false, 00:26:00.578 "compare_and_write": false, 00:26:00.578 "abort": true, 00:26:00.578 "seek_hole": false, 00:26:00.578 "seek_data": false, 00:26:00.578 "copy": true, 00:26:00.578 "nvme_iov_md": false 00:26:00.578 }, 00:26:00.578 "memory_domains": [ 00:26:00.578 { 00:26:00.578 "dma_device_id": "system", 00:26:00.578 "dma_device_type": 1 00:26:00.578 }, 00:26:00.578 { 00:26:00.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.578 "dma_device_type": 2 00:26:00.578 } 00:26:00.578 ], 00:26:00.578 "driver_specific": {} 00:26:00.578 }' 00:26:00.578 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.578 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:00.578 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:00.578 13:44:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:00.835 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.092 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:01.092 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:01.349 [2024-07-15 13:44:40.774224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.606 13:44:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:01.864 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.864 "name": "Existed_Raid", 00:26:01.864 "uuid": "104544cf-3968-4476-bae7-299ac46ac75d", 00:26:01.864 "strip_size_kb": 0, 00:26:01.864 "state": "online", 00:26:01.864 "raid_level": "raid1", 00:26:01.864 "superblock": true, 00:26:01.864 "num_base_bdevs": 2, 00:26:01.864 "num_base_bdevs_discovered": 1, 00:26:01.864 "num_base_bdevs_operational": 1, 00:26:01.864 "base_bdevs_list": [ 00:26:01.864 { 00:26:01.864 "name": null, 00:26:01.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.864 "is_configured": false, 00:26:01.864 "data_offset": 256, 00:26:01.864 "data_size": 7936 00:26:01.864 }, 00:26:01.864 { 00:26:01.864 "name": "BaseBdev2", 00:26:01.864 "uuid": 
"71f2639f-df1e-41bf-bda8-e193b8907a7c", 00:26:01.864 "is_configured": true, 00:26:01.864 "data_offset": 256, 00:26:01.864 "data_size": 7936 00:26:01.864 } 00:26:01.864 ] 00:26:01.864 }' 00:26:01.864 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.864 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:02.428 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:02.428 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:02.428 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:02.428 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.685 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:02.685 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:02.685 13:44:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:02.943 [2024-07-15 13:44:42.110802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:02.943 [2024-07-15 13:44:42.110886] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:02.943 [2024-07-15 13:44:42.121766] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:02.943 [2024-07-15 13:44:42.121803] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:02.943 [2024-07-15 13:44:42.121815] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xa71000 name Existed_Raid, state offline 00:26:02.943 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:02.943 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:02.943 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.943 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2206058 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2206058 ']' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2206058 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2206058 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2206058' 00:26:03.509 killing process with pid 2206058 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2206058 00:26:03.509 [2024-07-15 13:44:42.700576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2206058 00:26:03.509 [2024-07-15 13:44:42.701448] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:26:03.509 00:26:03.509 real 0m11.137s 00:26:03.509 user 0m19.807s 00:26:03.509 sys 0m2.090s 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:03.509 13:44:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:03.509 ************************************ 00:26:03.509 END TEST raid_state_function_test_sb_4k 00:26:03.509 ************************************ 00:26:03.768 13:44:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:03.768 13:44:42 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:26:03.768 13:44:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:03.768 13:44:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:03.768 13:44:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:03.768 ************************************ 00:26:03.768 START TEST raid_superblock_test_4k 00:26:03.768 ************************************ 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:03.768 13:44:42 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=2207683 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2207683 /var/tmp/spdk-raid.sock 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@829 -- # '[' -z 2207683 ']' 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:03.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:03.768 13:44:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:03.768 [2024-07-15 13:44:43.031595] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:26:03.768 [2024-07-15 13:44:43.031658] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2207683 ] 00:26:03.768 [2024-07-15 13:44:43.160902] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:04.026 [2024-07-15 13:44:43.267550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:04.026 [2024-07-15 13:44:43.343474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:04.026 [2024-07-15 13:44:43.343511] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:04.590 13:44:43 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:04.590 13:44:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:26:04.848 malloc1 00:26:04.848 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:05.105 [2024-07-15 13:44:44.334499] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:05.105 [2024-07-15 13:44:44.334546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.105 [2024-07-15 13:44:44.334568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e4570 00:26:05.105 [2024-07-15 13:44:44.334581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.105 [2024-07-15 13:44:44.336285] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.105 [2024-07-15 13:44:44.336314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:26:05.105 pt1 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:05.105 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:26:05.363 malloc2 00:26:05.363 13:44:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:05.928 [2024-07-15 13:44:45.073239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:05.928 [2024-07-15 13:44:45.073289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.928 [2024-07-15 13:44:45.073308] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e5970 00:26:05.928 [2024-07-15 13:44:45.073320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.928 [2024-07-15 13:44:45.074987] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.928 [2024-07-15 13:44:45.075016] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:05.928 pt2 00:26:05.928 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:05.928 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:05.928 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:05.928 [2024-07-15 13:44:45.321910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:05.928 [2024-07-15 13:44:45.323286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:05.928 [2024-07-15 13:44:45.323440] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2788270 00:26:05.928 [2024-07-15 13:44:45.323453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:05.928 [2024-07-15 13:44:45.323659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25dc0e0 00:26:05.928 [2024-07-15 13:44:45.323810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2788270 00:26:05.928 [2024-07-15 13:44:45.323821] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2788270 00:26:05.928 [2024-07-15 13:44:45.323923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.929 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.187 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.187 "name": "raid_bdev1", 00:26:06.187 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:06.187 "strip_size_kb": 0, 00:26:06.187 "state": "online", 00:26:06.187 "raid_level": "raid1", 00:26:06.187 "superblock": true, 00:26:06.187 "num_base_bdevs": 2, 00:26:06.187 "num_base_bdevs_discovered": 2, 00:26:06.187 "num_base_bdevs_operational": 2, 00:26:06.187 "base_bdevs_list": [ 00:26:06.187 { 00:26:06.187 "name": "pt1", 00:26:06.187 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:06.187 "is_configured": true, 00:26:06.187 "data_offset": 256, 00:26:06.187 "data_size": 7936 00:26:06.187 }, 00:26:06.187 { 00:26:06.187 "name": "pt2", 00:26:06.187 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:06.187 "is_configured": true, 00:26:06.187 "data_offset": 
256, 00:26:06.187 "data_size": 7936 00:26:06.187 } 00:26:06.187 ] 00:26:06.187 }' 00:26:06.187 13:44:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.187 13:44:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:06.752 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:06.752 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:06.752 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:06.752 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:06.752 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:06.753 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:06.753 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:06.753 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:07.011 [2024-07-15 13:44:46.292697] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:07.011 "name": "raid_bdev1", 00:26:07.011 "aliases": [ 00:26:07.011 "72dc0c3f-b938-417c-aec9-43f8df978a48" 00:26:07.011 ], 00:26:07.011 "product_name": "Raid Volume", 00:26:07.011 "block_size": 4096, 00:26:07.011 "num_blocks": 7936, 00:26:07.011 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:07.011 "assigned_rate_limits": { 00:26:07.011 "rw_ios_per_sec": 0, 00:26:07.011 "rw_mbytes_per_sec": 0, 00:26:07.011 "r_mbytes_per_sec": 0, 00:26:07.011 "w_mbytes_per_sec": 0 00:26:07.011 }, 00:26:07.011 "claimed": false, 
00:26:07.011 "zoned": false, 00:26:07.011 "supported_io_types": { 00:26:07.011 "read": true, 00:26:07.011 "write": true, 00:26:07.011 "unmap": false, 00:26:07.011 "flush": false, 00:26:07.011 "reset": true, 00:26:07.011 "nvme_admin": false, 00:26:07.011 "nvme_io": false, 00:26:07.011 "nvme_io_md": false, 00:26:07.011 "write_zeroes": true, 00:26:07.011 "zcopy": false, 00:26:07.011 "get_zone_info": false, 00:26:07.011 "zone_management": false, 00:26:07.011 "zone_append": false, 00:26:07.011 "compare": false, 00:26:07.011 "compare_and_write": false, 00:26:07.011 "abort": false, 00:26:07.011 "seek_hole": false, 00:26:07.011 "seek_data": false, 00:26:07.011 "copy": false, 00:26:07.011 "nvme_iov_md": false 00:26:07.011 }, 00:26:07.011 "memory_domains": [ 00:26:07.011 { 00:26:07.011 "dma_device_id": "system", 00:26:07.011 "dma_device_type": 1 00:26:07.011 }, 00:26:07.011 { 00:26:07.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.011 "dma_device_type": 2 00:26:07.011 }, 00:26:07.011 { 00:26:07.011 "dma_device_id": "system", 00:26:07.011 "dma_device_type": 1 00:26:07.011 }, 00:26:07.011 { 00:26:07.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.011 "dma_device_type": 2 00:26:07.011 } 00:26:07.011 ], 00:26:07.011 "driver_specific": { 00:26:07.011 "raid": { 00:26:07.011 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:07.011 "strip_size_kb": 0, 00:26:07.011 "state": "online", 00:26:07.011 "raid_level": "raid1", 00:26:07.011 "superblock": true, 00:26:07.011 "num_base_bdevs": 2, 00:26:07.011 "num_base_bdevs_discovered": 2, 00:26:07.011 "num_base_bdevs_operational": 2, 00:26:07.011 "base_bdevs_list": [ 00:26:07.011 { 00:26:07.011 "name": "pt1", 00:26:07.011 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:07.011 "is_configured": true, 00:26:07.011 "data_offset": 256, 00:26:07.011 "data_size": 7936 00:26:07.011 }, 00:26:07.011 { 00:26:07.011 "name": "pt2", 00:26:07.011 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:07.011 "is_configured": true, 
00:26:07.011 "data_offset": 256, 00:26:07.011 "data_size": 7936 00:26:07.011 } 00:26:07.011 ] 00:26:07.011 } 00:26:07.011 } 00:26:07.011 }' 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:07.011 pt2' 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:07.011 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:07.270 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:07.270 "name": "pt1", 00:26:07.270 "aliases": [ 00:26:07.270 "00000000-0000-0000-0000-000000000001" 00:26:07.270 ], 00:26:07.270 "product_name": "passthru", 00:26:07.270 "block_size": 4096, 00:26:07.270 "num_blocks": 8192, 00:26:07.270 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:07.270 "assigned_rate_limits": { 00:26:07.270 "rw_ios_per_sec": 0, 00:26:07.270 "rw_mbytes_per_sec": 0, 00:26:07.270 "r_mbytes_per_sec": 0, 00:26:07.270 "w_mbytes_per_sec": 0 00:26:07.270 }, 00:26:07.270 "claimed": true, 00:26:07.270 "claim_type": "exclusive_write", 00:26:07.270 "zoned": false, 00:26:07.270 "supported_io_types": { 00:26:07.270 "read": true, 00:26:07.270 "write": true, 00:26:07.270 "unmap": true, 00:26:07.270 "flush": true, 00:26:07.270 "reset": true, 00:26:07.270 "nvme_admin": false, 00:26:07.270 "nvme_io": false, 00:26:07.270 "nvme_io_md": false, 00:26:07.270 "write_zeroes": true, 00:26:07.270 "zcopy": true, 00:26:07.270 "get_zone_info": false, 00:26:07.270 "zone_management": false, 00:26:07.270 "zone_append": false, 
00:26:07.270 "compare": false, 00:26:07.270 "compare_and_write": false, 00:26:07.270 "abort": true, 00:26:07.270 "seek_hole": false, 00:26:07.270 "seek_data": false, 00:26:07.270 "copy": true, 00:26:07.270 "nvme_iov_md": false 00:26:07.270 }, 00:26:07.270 "memory_domains": [ 00:26:07.270 { 00:26:07.270 "dma_device_id": "system", 00:26:07.270 "dma_device_type": 1 00:26:07.270 }, 00:26:07.270 { 00:26:07.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.270 "dma_device_type": 2 00:26:07.270 } 00:26:07.270 ], 00:26:07.270 "driver_specific": { 00:26:07.270 "passthru": { 00:26:07.270 "name": "pt1", 00:26:07.270 "base_bdev_name": "malloc1" 00:26:07.270 } 00:26:07.270 } 00:26:07.270 }' 00:26:07.270 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:07.270 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:07.529 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:07.787 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:07.787 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:26:07.787 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:07.787 13:44:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:08.045 "name": "pt2", 00:26:08.045 "aliases": [ 00:26:08.045 "00000000-0000-0000-0000-000000000002" 00:26:08.045 ], 00:26:08.045 "product_name": "passthru", 00:26:08.045 "block_size": 4096, 00:26:08.045 "num_blocks": 8192, 00:26:08.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:08.045 "assigned_rate_limits": { 00:26:08.045 "rw_ios_per_sec": 0, 00:26:08.045 "rw_mbytes_per_sec": 0, 00:26:08.045 "r_mbytes_per_sec": 0, 00:26:08.045 "w_mbytes_per_sec": 0 00:26:08.045 }, 00:26:08.045 "claimed": true, 00:26:08.045 "claim_type": "exclusive_write", 00:26:08.045 "zoned": false, 00:26:08.045 "supported_io_types": { 00:26:08.045 "read": true, 00:26:08.045 "write": true, 00:26:08.045 "unmap": true, 00:26:08.045 "flush": true, 00:26:08.045 "reset": true, 00:26:08.045 "nvme_admin": false, 00:26:08.045 "nvme_io": false, 00:26:08.045 "nvme_io_md": false, 00:26:08.045 "write_zeroes": true, 00:26:08.045 "zcopy": true, 00:26:08.045 "get_zone_info": false, 00:26:08.045 "zone_management": false, 00:26:08.045 "zone_append": false, 00:26:08.045 "compare": false, 00:26:08.045 "compare_and_write": false, 00:26:08.045 "abort": true, 00:26:08.045 "seek_hole": false, 00:26:08.045 "seek_data": false, 00:26:08.045 "copy": true, 00:26:08.045 "nvme_iov_md": false 00:26:08.045 }, 00:26:08.045 "memory_domains": [ 00:26:08.045 { 00:26:08.045 "dma_device_id": "system", 00:26:08.045 "dma_device_type": 1 00:26:08.045 }, 00:26:08.045 { 00:26:08.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:08.045 "dma_device_type": 2 00:26:08.045 } 00:26:08.045 ], 00:26:08.045 
"driver_specific": { 00:26:08.045 "passthru": { 00:26:08.045 "name": "pt2", 00:26:08.045 "base_bdev_name": "malloc2" 00:26:08.045 } 00:26:08.045 } 00:26:08.045 }' 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:08.045 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.303 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.303 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:08.303 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:08.303 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:08.561 [2024-07-15 13:44:47.780619] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:08.561 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=72dc0c3f-b938-417c-aec9-43f8df978a48 00:26:08.561 13:44:47 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 72dc0c3f-b938-417c-aec9-43f8df978a48 ']' 00:26:08.561 13:44:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:08.819 [2024-07-15 13:44:48.025027] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:08.819 [2024-07-15 13:44:48.025048] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:08.819 [2024-07-15 13:44:48.025105] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:08.819 [2024-07-15 13:44:48.025165] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:08.819 [2024-07-15 13:44:48.025177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2788270 name raid_bdev1, state offline 00:26:08.819 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.819 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:09.076 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:09.076 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:09.076 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:09.076 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:09.334 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:09.334 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:09.592 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:09.592 13:44:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- 
# case "$(type -t "$arg")" in 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:09.851 [2024-07-15 13:44:49.256376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:09.851 [2024-07-15 13:44:49.257769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:09.851 [2024-07-15 13:44:49.257825] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:09.851 [2024-07-15 13:44:49.257866] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:09.851 [2024-07-15 13:44:49.257885] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:09.851 [2024-07-15 13:44:49.257895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2787ff0 name raid_bdev1, state configuring 00:26:09.851 request: 00:26:09.851 { 00:26:09.851 "name": "raid_bdev1", 00:26:09.851 "raid_level": "raid1", 00:26:09.851 "base_bdevs": [ 00:26:09.851 "malloc1", 00:26:09.851 "malloc2" 00:26:09.851 ], 00:26:09.851 "superblock": false, 00:26:09.851 "method": "bdev_raid_create", 00:26:09.851 "req_id": 1 00:26:09.851 } 00:26:09.851 Got JSON-RPC error response 00:26:09.851 response: 00:26:09.851 { 00:26:09.851 "code": -17, 00:26:09.851 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:09.851 } 00:26:09.851 13:44:49 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:09.851 13:44:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:10.110 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.110 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:10.110 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:10.110 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:10.110 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:10.368 [2024-07-15 13:44:49.749622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:10.368 [2024-07-15 13:44:49.749668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:10.368 [2024-07-15 13:44:49.749691] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e47a0 00:26:10.368 [2024-07-15 13:44:49.749704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:10.368 [2024-07-15 13:44:49.751339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:10.368 [2024-07-15 13:44:49.751366] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:10.368 [2024-07-15 13:44:49.751432] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 
00:26:10.368 [2024-07-15 13:44:49.751458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:10.368 pt1 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.368 13:44:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.627 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:10.627 "name": "raid_bdev1", 00:26:10.627 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:10.627 "strip_size_kb": 0, 00:26:10.627 "state": "configuring", 00:26:10.627 "raid_level": "raid1", 00:26:10.627 "superblock": true, 00:26:10.627 "num_base_bdevs": 2, 00:26:10.627 
"num_base_bdevs_discovered": 1, 00:26:10.627 "num_base_bdevs_operational": 2, 00:26:10.627 "base_bdevs_list": [ 00:26:10.627 { 00:26:10.627 "name": "pt1", 00:26:10.627 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:10.627 "is_configured": true, 00:26:10.627 "data_offset": 256, 00:26:10.627 "data_size": 7936 00:26:10.627 }, 00:26:10.627 { 00:26:10.627 "name": null, 00:26:10.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:10.627 "is_configured": false, 00:26:10.627 "data_offset": 256, 00:26:10.627 "data_size": 7936 00:26:10.627 } 00:26:10.627 ] 00:26:10.627 }' 00:26:10.627 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:10.627 13:44:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:11.193 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:11.193 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:11.193 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:11.193 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:11.462 [2024-07-15 13:44:50.808438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:11.462 [2024-07-15 13:44:50.808495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:11.462 [2024-07-15 13:44:50.808515] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277c6f0 00:26:11.462 [2024-07-15 13:44:50.808527] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:11.462 [2024-07-15 13:44:50.808882] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:11.462 [2024-07-15 13:44:50.808899] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:11.462 [2024-07-15 13:44:50.808978] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:11.462 [2024-07-15 13:44:50.808999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:11.462 [2024-07-15 13:44:50.809103] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x277d590 00:26:11.462 [2024-07-15 13:44:50.809114] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:11.462 [2024-07-15 13:44:50.809281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25de540 00:26:11.462 [2024-07-15 13:44:50.809407] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x277d590 00:26:11.462 [2024-07-15 13:44:50.809418] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x277d590 00:26:11.462 [2024-07-15 13:44:50.809513] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.462 pt2 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.463 13:44:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.735 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.735 "name": "raid_bdev1", 00:26:11.735 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:11.735 "strip_size_kb": 0, 00:26:11.735 "state": "online", 00:26:11.735 "raid_level": "raid1", 00:26:11.735 "superblock": true, 00:26:11.735 "num_base_bdevs": 2, 00:26:11.735 "num_base_bdevs_discovered": 2, 00:26:11.735 "num_base_bdevs_operational": 2, 00:26:11.735 "base_bdevs_list": [ 00:26:11.735 { 00:26:11.735 "name": "pt1", 00:26:11.735 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:11.735 "is_configured": true, 00:26:11.735 "data_offset": 256, 00:26:11.735 "data_size": 7936 00:26:11.735 }, 00:26:11.735 { 00:26:11.735 "name": "pt2", 00:26:11.735 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:11.735 "is_configured": true, 00:26:11.735 "data_offset": 256, 00:26:11.735 "data_size": 7936 00:26:11.735 } 00:26:11.735 ] 00:26:11.735 }' 00:26:11.735 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.735 13:44:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:12.301 13:44:51 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:12.301 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:12.559 [2024-07-15 13:44:51.899571] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:12.560 "name": "raid_bdev1", 00:26:12.560 "aliases": [ 00:26:12.560 "72dc0c3f-b938-417c-aec9-43f8df978a48" 00:26:12.560 ], 00:26:12.560 "product_name": "Raid Volume", 00:26:12.560 "block_size": 4096, 00:26:12.560 "num_blocks": 7936, 00:26:12.560 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:12.560 "assigned_rate_limits": { 00:26:12.560 "rw_ios_per_sec": 0, 00:26:12.560 "rw_mbytes_per_sec": 0, 00:26:12.560 "r_mbytes_per_sec": 0, 00:26:12.560 "w_mbytes_per_sec": 0 00:26:12.560 }, 00:26:12.560 "claimed": false, 00:26:12.560 "zoned": false, 00:26:12.560 "supported_io_types": { 00:26:12.560 "read": true, 00:26:12.560 "write": true, 00:26:12.560 "unmap": false, 00:26:12.560 "flush": false, 00:26:12.560 "reset": true, 00:26:12.560 "nvme_admin": false, 00:26:12.560 "nvme_io": false, 00:26:12.560 "nvme_io_md": false, 
00:26:12.560 "write_zeroes": true, 00:26:12.560 "zcopy": false, 00:26:12.560 "get_zone_info": false, 00:26:12.560 "zone_management": false, 00:26:12.560 "zone_append": false, 00:26:12.560 "compare": false, 00:26:12.560 "compare_and_write": false, 00:26:12.560 "abort": false, 00:26:12.560 "seek_hole": false, 00:26:12.560 "seek_data": false, 00:26:12.560 "copy": false, 00:26:12.560 "nvme_iov_md": false 00:26:12.560 }, 00:26:12.560 "memory_domains": [ 00:26:12.560 { 00:26:12.560 "dma_device_id": "system", 00:26:12.560 "dma_device_type": 1 00:26:12.560 }, 00:26:12.560 { 00:26:12.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.560 "dma_device_type": 2 00:26:12.560 }, 00:26:12.560 { 00:26:12.560 "dma_device_id": "system", 00:26:12.560 "dma_device_type": 1 00:26:12.560 }, 00:26:12.560 { 00:26:12.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.560 "dma_device_type": 2 00:26:12.560 } 00:26:12.560 ], 00:26:12.560 "driver_specific": { 00:26:12.560 "raid": { 00:26:12.560 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:12.560 "strip_size_kb": 0, 00:26:12.560 "state": "online", 00:26:12.560 "raid_level": "raid1", 00:26:12.560 "superblock": true, 00:26:12.560 "num_base_bdevs": 2, 00:26:12.560 "num_base_bdevs_discovered": 2, 00:26:12.560 "num_base_bdevs_operational": 2, 00:26:12.560 "base_bdevs_list": [ 00:26:12.560 { 00:26:12.560 "name": "pt1", 00:26:12.560 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:12.560 "is_configured": true, 00:26:12.560 "data_offset": 256, 00:26:12.560 "data_size": 7936 00:26:12.560 }, 00:26:12.560 { 00:26:12.560 "name": "pt2", 00:26:12.560 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:12.560 "is_configured": true, 00:26:12.560 "data_offset": 256, 00:26:12.560 "data_size": 7936 00:26:12.560 } 00:26:12.560 ] 00:26:12.560 } 00:26:12.560 } 00:26:12.560 }' 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:12.560 pt2' 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:12.560 13:44:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:12.818 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:12.818 "name": "pt1", 00:26:12.818 "aliases": [ 00:26:12.818 "00000000-0000-0000-0000-000000000001" 00:26:12.818 ], 00:26:12.818 "product_name": "passthru", 00:26:12.818 "block_size": 4096, 00:26:12.818 "num_blocks": 8192, 00:26:12.818 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:12.818 "assigned_rate_limits": { 00:26:12.818 "rw_ios_per_sec": 0, 00:26:12.818 "rw_mbytes_per_sec": 0, 00:26:12.818 "r_mbytes_per_sec": 0, 00:26:12.818 "w_mbytes_per_sec": 0 00:26:12.818 }, 00:26:12.818 "claimed": true, 00:26:12.818 "claim_type": "exclusive_write", 00:26:12.818 "zoned": false, 00:26:12.818 "supported_io_types": { 00:26:12.818 "read": true, 00:26:12.818 "write": true, 00:26:12.818 "unmap": true, 00:26:12.818 "flush": true, 00:26:12.818 "reset": true, 00:26:12.818 "nvme_admin": false, 00:26:12.818 "nvme_io": false, 00:26:12.818 "nvme_io_md": false, 00:26:12.818 "write_zeroes": true, 00:26:12.818 "zcopy": true, 00:26:12.818 "get_zone_info": false, 00:26:12.818 "zone_management": false, 00:26:12.818 "zone_append": false, 00:26:12.818 "compare": false, 00:26:12.818 "compare_and_write": false, 00:26:12.818 "abort": true, 00:26:12.818 "seek_hole": false, 00:26:12.818 "seek_data": false, 00:26:12.818 "copy": true, 00:26:12.818 "nvme_iov_md": false 00:26:12.818 }, 00:26:12.818 "memory_domains": [ 00:26:12.818 { 00:26:12.818 
"dma_device_id": "system", 00:26:12.818 "dma_device_type": 1 00:26:12.818 }, 00:26:12.818 { 00:26:12.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.818 "dma_device_type": 2 00:26:12.818 } 00:26:12.818 ], 00:26:12.818 "driver_specific": { 00:26:12.818 "passthru": { 00:26:12.818 "name": "pt1", 00:26:12.818 "base_bdev_name": "malloc1" 00:26:12.818 } 00:26:12.818 } 00:26:12.818 }' 00:26:12.818 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:13.076 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.334 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.334 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:13.334 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:13.335 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:13.335 13:44:52 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:13.592 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:13.592 "name": "pt2", 00:26:13.592 "aliases": [ 00:26:13.592 "00000000-0000-0000-0000-000000000002" 00:26:13.592 ], 00:26:13.593 "product_name": "passthru", 00:26:13.593 "block_size": 4096, 00:26:13.593 "num_blocks": 8192, 00:26:13.593 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:13.593 "assigned_rate_limits": { 00:26:13.593 "rw_ios_per_sec": 0, 00:26:13.593 "rw_mbytes_per_sec": 0, 00:26:13.593 "r_mbytes_per_sec": 0, 00:26:13.593 "w_mbytes_per_sec": 0 00:26:13.593 }, 00:26:13.593 "claimed": true, 00:26:13.593 "claim_type": "exclusive_write", 00:26:13.593 "zoned": false, 00:26:13.593 "supported_io_types": { 00:26:13.593 "read": true, 00:26:13.593 "write": true, 00:26:13.593 "unmap": true, 00:26:13.593 "flush": true, 00:26:13.593 "reset": true, 00:26:13.593 "nvme_admin": false, 00:26:13.593 "nvme_io": false, 00:26:13.593 "nvme_io_md": false, 00:26:13.593 "write_zeroes": true, 00:26:13.593 "zcopy": true, 00:26:13.593 "get_zone_info": false, 00:26:13.593 "zone_management": false, 00:26:13.593 "zone_append": false, 00:26:13.593 "compare": false, 00:26:13.593 "compare_and_write": false, 00:26:13.593 "abort": true, 00:26:13.593 "seek_hole": false, 00:26:13.593 "seek_data": false, 00:26:13.593 "copy": true, 00:26:13.593 "nvme_iov_md": false 00:26:13.593 }, 00:26:13.593 "memory_domains": [ 00:26:13.593 { 00:26:13.593 "dma_device_id": "system", 00:26:13.593 "dma_device_type": 1 00:26:13.593 }, 00:26:13.593 { 00:26:13.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:13.593 "dma_device_type": 2 00:26:13.593 } 00:26:13.593 ], 00:26:13.593 "driver_specific": { 00:26:13.593 "passthru": { 00:26:13.593 "name": "pt2", 00:26:13.593 "base_bdev_name": "malloc2" 00:26:13.593 } 00:26:13.593 } 00:26:13.593 }' 00:26:13.593 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.593 13:44:52 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.593 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:13.593 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.593 13:44:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.593 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:13.593 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:13.851 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:14.109 [2024-07-15 13:44:53.415568] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:14.109 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 72dc0c3f-b938-417c-aec9-43f8df978a48 '!=' 72dc0c3f-b938-417c-aec9-43f8df978a48 ']' 00:26:14.109 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:14.109 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:14.109 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # 
return 0 00:26:14.109 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:14.367 [2024-07-15 13:44:53.664020] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.367 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.625 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:14.625 "name": "raid_bdev1", 00:26:14.625 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:14.625 
"strip_size_kb": 0, 00:26:14.625 "state": "online", 00:26:14.625 "raid_level": "raid1", 00:26:14.625 "superblock": true, 00:26:14.625 "num_base_bdevs": 2, 00:26:14.625 "num_base_bdevs_discovered": 1, 00:26:14.625 "num_base_bdevs_operational": 1, 00:26:14.625 "base_bdevs_list": [ 00:26:14.625 { 00:26:14.625 "name": null, 00:26:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.625 "is_configured": false, 00:26:14.625 "data_offset": 256, 00:26:14.625 "data_size": 7936 00:26:14.625 }, 00:26:14.625 { 00:26:14.625 "name": "pt2", 00:26:14.625 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:14.625 "is_configured": true, 00:26:14.625 "data_offset": 256, 00:26:14.625 "data_size": 7936 00:26:14.625 } 00:26:14.625 ] 00:26:14.625 }' 00:26:14.625 13:44:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:14.625 13:44:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:15.191 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:15.450 [2024-07-15 13:44:54.726798] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:15.450 [2024-07-15 13:44:54.726824] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:15.450 [2024-07-15 13:44:54.726878] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:15.450 [2024-07-15 13:44:54.726921] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:15.450 [2024-07-15 13:44:54.726939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x277d590 name raid_bdev1, state offline 00:26:15.450 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:15.450 13:44:54 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.708 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:15.708 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:15.708 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:15.708 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:15.708 13:44:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:26:15.967 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:16.225 [2024-07-15 13:44:55.432637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:16.225 [2024-07-15 13:44:55.432683] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.225 [2024-07-15 13:44:55.432700] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e5160 00:26:16.225 [2024-07-15 13:44:55.432712] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:26:16.225 [2024-07-15 13:44:55.434371] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.225 [2024-07-15 13:44:55.434399] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:16.225 [2024-07-15 13:44:55.434464] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:16.225 [2024-07-15 13:44:55.434491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:16.225 [2024-07-15 13:44:55.434576] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25db380 00:26:16.225 [2024-07-15 13:44:55.434587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:16.225 [2024-07-15 13:44:55.434761] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25dca80 00:26:16.225 [2024-07-15 13:44:55.434883] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25db380 00:26:16.225 [2024-07-15 13:44:55.434893] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25db380 00:26:16.225 [2024-07-15 13:44:55.434998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:16.225 pt2 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:16.225 13:44:55 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.225 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.484 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.484 "name": "raid_bdev1", 00:26:16.484 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:16.484 "strip_size_kb": 0, 00:26:16.484 "state": "online", 00:26:16.484 "raid_level": "raid1", 00:26:16.484 "superblock": true, 00:26:16.484 "num_base_bdevs": 2, 00:26:16.484 "num_base_bdevs_discovered": 1, 00:26:16.484 "num_base_bdevs_operational": 1, 00:26:16.484 "base_bdevs_list": [ 00:26:16.484 { 00:26:16.484 "name": null, 00:26:16.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.484 "is_configured": false, 00:26:16.484 "data_offset": 256, 00:26:16.484 "data_size": 7936 00:26:16.484 }, 00:26:16.484 { 00:26:16.484 "name": "pt2", 00:26:16.484 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:16.484 "is_configured": true, 00:26:16.484 "data_offset": 256, 00:26:16.484 "data_size": 7936 00:26:16.484 } 00:26:16.484 ] 00:26:16.484 }' 00:26:16.484 13:44:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.484 13:44:55 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:17.049 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:17.307 [2024-07-15 13:44:56.515503] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:17.307 [2024-07-15 13:44:56.515532] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:17.307 [2024-07-15 13:44:56.515588] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:17.307 [2024-07-15 13:44:56.515633] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:17.307 [2024-07-15 13:44:56.515645] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25db380 name raid_bdev1, state offline 00:26:17.307 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.307 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:17.565 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:17.565 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:17.565 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:17.565 13:44:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:17.824 [2024-07-15 13:44:56.996747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:17.824 [2024-07-15 13:44:56.996792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.824 [2024-07-15 13:44:56.996810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x2787520 00:26:17.824 [2024-07-15 13:44:56.996822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.824 [2024-07-15 13:44:56.998472] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.824 [2024-07-15 13:44:56.998502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:17.824 [2024-07-15 13:44:56.998569] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:17.824 [2024-07-15 13:44:56.998596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:17.824 [2024-07-15 13:44:56.998697] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:17.824 [2024-07-15 13:44:56.998710] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:17.824 [2024-07-15 13:44:56.998724] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25dc3f0 name raid_bdev1, state configuring 00:26:17.824 [2024-07-15 13:44:56.998748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:17.824 [2024-07-15 13:44:56.998811] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25de2b0 00:26:17.824 [2024-07-15 13:44:56.998822] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:17.824 [2024-07-15 13:44:56.998996] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25db350 00:26:17.824 [2024-07-15 13:44:56.999121] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25de2b0 00:26:17.824 [2024-07-15 13:44:56.999131] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25de2b0 00:26:17.824 [2024-07-15 13:44:56.999231] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:17.824 pt1 00:26:17.824 13:44:57 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.824 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.112 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:18.112 "name": "raid_bdev1", 00:26:18.112 "uuid": "72dc0c3f-b938-417c-aec9-43f8df978a48", 00:26:18.112 "strip_size_kb": 0, 00:26:18.112 "state": "online", 00:26:18.112 "raid_level": "raid1", 00:26:18.112 "superblock": true, 00:26:18.112 "num_base_bdevs": 2, 00:26:18.112 "num_base_bdevs_discovered": 1, 00:26:18.112 "num_base_bdevs_operational": 
1, 00:26:18.112 "base_bdevs_list": [ 00:26:18.112 { 00:26:18.112 "name": null, 00:26:18.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.112 "is_configured": false, 00:26:18.112 "data_offset": 256, 00:26:18.112 "data_size": 7936 00:26:18.112 }, 00:26:18.112 { 00:26:18.112 "name": "pt2", 00:26:18.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:18.112 "is_configured": true, 00:26:18.112 "data_offset": 256, 00:26:18.112 "data_size": 7936 00:26:18.112 } 00:26:18.112 ] 00:26:18.112 }' 00:26:18.112 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:18.112 13:44:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:18.677 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:18.677 13:44:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:18.935 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:18.935 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:18.935 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:18.935 [2024-07-15 13:44:58.348544] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 72dc0c3f-b938-417c-aec9-43f8df978a48 '!=' 72dc0c3f-b938-417c-aec9-43f8df978a48 ']' 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2207683 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2207683 ']' 00:26:19.193 13:44:58 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2207683 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2207683 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2207683' 00:26:19.193 killing process with pid 2207683 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2207683 00:26:19.193 [2024-07-15 13:44:58.404959] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:19.193 [2024-07-15 13:44:58.405011] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:19.193 [2024-07-15 13:44:58.405053] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:19.193 [2024-07-15 13:44:58.405065] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25de2b0 name raid_bdev1, state offline 00:26:19.193 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2207683 00:26:19.193 [2024-07-15 13:44:58.421550] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:19.452 13:44:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:26:19.452 00:26:19.452 real 0m15.653s 00:26:19.452 user 0m28.434s 00:26:19.452 sys 0m2.833s 00:26:19.452 13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:19.452 
13:44:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:19.452 ************************************ 00:26:19.452 END TEST raid_superblock_test_4k 00:26:19.452 ************************************ 00:26:19.452 13:44:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:19.452 13:44:58 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:26:19.452 13:44:58 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:26:19.452 13:44:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:19.452 13:44:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:19.452 13:44:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:19.452 ************************************ 00:26:19.452 START TEST raid_rebuild_test_sb_4k 00:26:19.452 ************************************ 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 
-- # (( i++ )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:19.452 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2209948 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:19.453 13:44:58 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2209948 /var/tmp/spdk-raid.sock 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2209948 ']' 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:19.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:19.453 13:44:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:19.453 [2024-07-15 13:44:58.778923] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:26:19.453 [2024-07-15 13:44:58.778996] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2209948 ] 00:26:19.453 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:19.453 Zero copy mechanism will not be used. 
00:26:19.711 [2024-07-15 13:44:58.907650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:19.711 [2024-07-15 13:44:59.014198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:19.711 [2024-07-15 13:44:59.081015] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:19.711 [2024-07-15 13:44:59.081056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:20.646 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:20.646 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:20.646 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:20.646 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:26:20.646 BaseBdev1_malloc 00:26:20.646 13:44:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:20.904 [2024-07-15 13:45:00.175325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:20.904 [2024-07-15 13:45:00.175375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.904 [2024-07-15 13:45:00.175400] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1806d40 00:26:20.904 [2024-07-15 13:45:00.175413] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.904 [2024-07-15 13:45:00.177181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.904 [2024-07-15 13:45:00.177209] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:20.904 
BaseBdev1 00:26:20.904 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:20.904 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:26:21.164 BaseBdev2_malloc 00:26:21.164 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:21.423 [2024-07-15 13:45:00.669995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:21.423 [2024-07-15 13:45:00.670045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:21.423 [2024-07-15 13:45:00.670070] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1807860 00:26:21.423 [2024-07-15 13:45:00.670083] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:21.423 [2024-07-15 13:45:00.671667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:21.423 [2024-07-15 13:45:00.671701] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:21.423 BaseBdev2 00:26:21.423 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:21.682 spare_malloc 00:26:21.682 13:45:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:21.940 spare_delay 00:26:21.940 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:22.199 [2024-07-15 13:45:01.401382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:22.199 [2024-07-15 13:45:01.401429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.199 [2024-07-15 13:45:01.401452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b5ec0 00:26:22.199 [2024-07-15 13:45:01.401464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.199 [2024-07-15 13:45:01.403070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.199 [2024-07-15 13:45:01.403099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:22.199 spare 00:26:22.199 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:22.458 [2024-07-15 13:45:01.646068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:22.458 [2024-07-15 13:45:01.647431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:22.458 [2024-07-15 13:45:01.647607] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b7070 00:26:22.458 [2024-07-15 13:45:01.647620] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:22.458 [2024-07-15 13:45:01.647819] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b0490 00:26:22.458 [2024-07-15 13:45:01.647972] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b7070 00:26:22.458 [2024-07-15 13:45:01.647983] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x19b7070 00:26:22.458 [2024-07-15 13:45:01.648086] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.458 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.717 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.717 "name": "raid_bdev1", 00:26:22.717 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:22.717 "strip_size_kb": 0, 00:26:22.717 "state": "online", 00:26:22.717 "raid_level": "raid1", 00:26:22.717 "superblock": true, 00:26:22.717 "num_base_bdevs": 2, 00:26:22.717 
"num_base_bdevs_discovered": 2, 00:26:22.717 "num_base_bdevs_operational": 2, 00:26:22.717 "base_bdevs_list": [ 00:26:22.717 { 00:26:22.717 "name": "BaseBdev1", 00:26:22.717 "uuid": "6cc05ee6-1049-5a46-8d7a-2d07bacd8d0e", 00:26:22.717 "is_configured": true, 00:26:22.717 "data_offset": 256, 00:26:22.717 "data_size": 7936 00:26:22.717 }, 00:26:22.717 { 00:26:22.717 "name": "BaseBdev2", 00:26:22.717 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:22.717 "is_configured": true, 00:26:22.717 "data_offset": 256, 00:26:22.717 "data_size": 7936 00:26:22.717 } 00:26:22.717 ] 00:26:22.717 }' 00:26:22.717 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.717 13:45:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:23.284 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:23.284 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:23.284 [2024-07-15 13:45:02.660988] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:23.284 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:23.284 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.284 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:23.542 
13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:23.542 13:45:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:23.800 [2024-07-15 13:45:03.154095] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b0490 00:26:23.800 /dev/nbd0 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:23.800 13:45:03 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:23.800 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:24.059 1+0 records in 00:26:24.059 1+0 records out 00:26:24.059 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304442 s, 13.5 MB/s 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:24.059 13:45:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:24.996 7936+0 records in 00:26:24.996 7936+0 records out 00:26:24.996 32505856 bytes (33 MB, 31 MiB) copied, 0.748049 s, 43.5 MB/s 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:24.996 [2024-07-15 13:45:04.337319] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:26:24.996 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:25.255 [2024-07-15 13:45:04.569993] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.255 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.256 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.256 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.515 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.515 "name": "raid_bdev1", 00:26:25.515 "uuid": 
"55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:25.515 "strip_size_kb": 0, 00:26:25.515 "state": "online", 00:26:25.515 "raid_level": "raid1", 00:26:25.515 "superblock": true, 00:26:25.515 "num_base_bdevs": 2, 00:26:25.515 "num_base_bdevs_discovered": 1, 00:26:25.515 "num_base_bdevs_operational": 1, 00:26:25.515 "base_bdevs_list": [ 00:26:25.515 { 00:26:25.515 "name": null, 00:26:25.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.515 "is_configured": false, 00:26:25.515 "data_offset": 256, 00:26:25.515 "data_size": 7936 00:26:25.515 }, 00:26:25.515 { 00:26:25.515 "name": "BaseBdev2", 00:26:25.515 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:25.515 "is_configured": true, 00:26:25.515 "data_offset": 256, 00:26:25.515 "data_size": 7936 00:26:25.515 } 00:26:25.515 ] 00:26:25.515 }' 00:26:25.515 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.515 13:45:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:26.110 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:26.369 [2024-07-15 13:45:05.584689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:26.369 [2024-07-15 13:45:05.589626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b6ce0 00:26:26.369 [2024-07-15 13:45:05.591835] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:26.369 13:45:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.306 13:45:06 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.306 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.564 "name": "raid_bdev1", 00:26:27.564 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:27.564 "strip_size_kb": 0, 00:26:27.564 "state": "online", 00:26:27.564 "raid_level": "raid1", 00:26:27.564 "superblock": true, 00:26:27.564 "num_base_bdevs": 2, 00:26:27.564 "num_base_bdevs_discovered": 2, 00:26:27.564 "num_base_bdevs_operational": 2, 00:26:27.564 "process": { 00:26:27.564 "type": "rebuild", 00:26:27.564 "target": "spare", 00:26:27.564 "progress": { 00:26:27.564 "blocks": 2816, 00:26:27.564 "percent": 35 00:26:27.564 } 00:26:27.564 }, 00:26:27.564 "base_bdevs_list": [ 00:26:27.564 { 00:26:27.564 "name": "spare", 00:26:27.564 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:27.564 "is_configured": true, 00:26:27.564 "data_offset": 256, 00:26:27.564 "data_size": 7936 00:26:27.564 }, 00:26:27.564 { 00:26:27.564 "name": "BaseBdev2", 00:26:27.564 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:27.564 "is_configured": true, 00:26:27.564 "data_offset": 256, 00:26:27.564 "data_size": 7936 00:26:27.564 } 00:26:27.564 ] 00:26:27.564 }' 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.564 13:45:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:27.823 [2024-07-15 13:45:07.094254] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:27.823 [2024-07-15 13:45:07.103605] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:27.823 [2024-07-15 13:45:07.103651] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:27.823 [2024-07-15 13:45:07.103666] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:27.823 [2024-07-15 13:45:07.103675] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.823 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.824 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.824 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.082 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:28.082 "name": "raid_bdev1", 00:26:28.082 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:28.082 "strip_size_kb": 0, 00:26:28.082 "state": "online", 00:26:28.082 "raid_level": "raid1", 00:26:28.082 "superblock": true, 00:26:28.082 "num_base_bdevs": 2, 00:26:28.082 "num_base_bdevs_discovered": 1, 00:26:28.082 "num_base_bdevs_operational": 1, 00:26:28.082 "base_bdevs_list": [ 00:26:28.082 { 00:26:28.082 "name": null, 00:26:28.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:28.082 "is_configured": false, 00:26:28.082 "data_offset": 256, 00:26:28.082 "data_size": 7936 00:26:28.082 }, 00:26:28.082 { 00:26:28.082 "name": "BaseBdev2", 00:26:28.082 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:28.082 "is_configured": true, 00:26:28.082 "data_offset": 256, 00:26:28.082 "data_size": 7936 00:26:28.082 } 00:26:28.082 ] 00:26:28.082 }' 00:26:28.082 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:28.082 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.649 13:45:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.908 "name": "raid_bdev1", 00:26:28.908 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:28.908 "strip_size_kb": 0, 00:26:28.908 "state": "online", 00:26:28.908 "raid_level": "raid1", 00:26:28.908 "superblock": true, 00:26:28.908 "num_base_bdevs": 2, 00:26:28.908 "num_base_bdevs_discovered": 1, 00:26:28.908 "num_base_bdevs_operational": 1, 00:26:28.908 "base_bdevs_list": [ 00:26:28.908 { 00:26:28.908 "name": null, 00:26:28.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:28.908 "is_configured": false, 00:26:28.908 "data_offset": 256, 00:26:28.908 "data_size": 7936 00:26:28.908 }, 00:26:28.908 { 00:26:28.908 "name": "BaseBdev2", 00:26:28.908 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:28.908 "is_configured": true, 00:26:28.908 "data_offset": 256, 00:26:28.908 "data_size": 7936 00:26:28.908 } 00:26:28.908 ] 00:26:28.908 }' 00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:28.908 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:29.166 [2024-07-15 13:45:08.515691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:29.166 [2024-07-15 13:45:08.520630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19b6ce0 00:26:29.166 [2024-07-15 13:45:08.522099] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:29.166 13:45:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.543 "name": "raid_bdev1", 00:26:30.543 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:30.543 "strip_size_kb": 0, 00:26:30.543 "state": "online", 00:26:30.543 
"raid_level": "raid1", 00:26:30.543 "superblock": true, 00:26:30.543 "num_base_bdevs": 2, 00:26:30.543 "num_base_bdevs_discovered": 2, 00:26:30.543 "num_base_bdevs_operational": 2, 00:26:30.543 "process": { 00:26:30.543 "type": "rebuild", 00:26:30.543 "target": "spare", 00:26:30.543 "progress": { 00:26:30.543 "blocks": 3072, 00:26:30.543 "percent": 38 00:26:30.543 } 00:26:30.543 }, 00:26:30.543 "base_bdevs_list": [ 00:26:30.543 { 00:26:30.543 "name": "spare", 00:26:30.543 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:30.543 "is_configured": true, 00:26:30.543 "data_offset": 256, 00:26:30.543 "data_size": 7936 00:26:30.543 }, 00:26:30.543 { 00:26:30.543 "name": "BaseBdev2", 00:26:30.543 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:30.543 "is_configured": true, 00:26:30.543 "data_offset": 256, 00:26:30.543 "data_size": 7936 00:26:30.543 } 00:26:30.543 ] 00:26:30.543 }' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:30.543 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1013 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.543 13:45:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.802 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.802 "name": "raid_bdev1", 00:26:30.802 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:30.802 "strip_size_kb": 0, 00:26:30.802 "state": "online", 00:26:30.802 "raid_level": "raid1", 00:26:30.802 "superblock": true, 00:26:30.802 "num_base_bdevs": 2, 00:26:30.802 "num_base_bdevs_discovered": 2, 00:26:30.802 "num_base_bdevs_operational": 2, 00:26:30.802 "process": { 00:26:30.802 "type": "rebuild", 00:26:30.802 "target": "spare", 00:26:30.802 "progress": { 00:26:30.802 "blocks": 3840, 00:26:30.802 "percent": 48 00:26:30.802 } 00:26:30.802 }, 00:26:30.802 "base_bdevs_list": [ 00:26:30.802 { 00:26:30.802 "name": "spare", 00:26:30.802 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:30.802 "is_configured": 
true, 00:26:30.802 "data_offset": 256, 00:26:30.802 "data_size": 7936 00:26:30.802 }, 00:26:30.802 { 00:26:30.802 "name": "BaseBdev2", 00:26:30.802 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:30.802 "is_configured": true, 00:26:30.802 "data_offset": 256, 00:26:30.802 "data_size": 7936 00:26:30.802 } 00:26:30.802 ] 00:26:30.802 }' 00:26:30.802 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.802 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:30.802 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.060 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:31.060 13:45:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.991 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.249 "name": "raid_bdev1", 00:26:32.249 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:32.249 "strip_size_kb": 0, 00:26:32.249 "state": "online", 00:26:32.249 "raid_level": "raid1", 00:26:32.249 "superblock": true, 00:26:32.249 "num_base_bdevs": 2, 00:26:32.249 "num_base_bdevs_discovered": 2, 00:26:32.249 "num_base_bdevs_operational": 2, 00:26:32.249 "process": { 00:26:32.249 "type": "rebuild", 00:26:32.249 "target": "spare", 00:26:32.249 "progress": { 00:26:32.249 "blocks": 7424, 00:26:32.249 "percent": 93 00:26:32.249 } 00:26:32.249 }, 00:26:32.249 "base_bdevs_list": [ 00:26:32.249 { 00:26:32.249 "name": "spare", 00:26:32.249 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:32.249 "is_configured": true, 00:26:32.249 "data_offset": 256, 00:26:32.249 "data_size": 7936 00:26:32.249 }, 00:26:32.249 { 00:26:32.249 "name": "BaseBdev2", 00:26:32.249 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:32.249 "is_configured": true, 00:26:32.249 "data_offset": 256, 00:26:32.249 "data_size": 7936 00:26:32.249 } 00:26:32.249 ] 00:26:32.249 }' 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:32.249 13:45:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:32.249 [2024-07-15 13:45:11.645880] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:32.249 [2024-07-15 13:45:11.645949] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:32.249 [2024-07-15 13:45:11.646033] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.183 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:33.749 "name": "raid_bdev1", 00:26:33.749 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:33.749 "strip_size_kb": 0, 00:26:33.749 "state": "online", 00:26:33.749 "raid_level": "raid1", 00:26:33.749 "superblock": true, 00:26:33.749 "num_base_bdevs": 2, 00:26:33.749 "num_base_bdevs_discovered": 2, 00:26:33.749 "num_base_bdevs_operational": 2, 00:26:33.749 "base_bdevs_list": [ 00:26:33.749 { 00:26:33.749 "name": "spare", 00:26:33.749 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:33.749 "is_configured": true, 00:26:33.749 "data_offset": 256, 00:26:33.749 "data_size": 7936 00:26:33.749 }, 00:26:33.749 { 00:26:33.749 "name": "BaseBdev2", 00:26:33.749 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:33.749 "is_configured": true, 00:26:33.749 "data_offset": 256, 00:26:33.749 
"data_size": 7936 00:26:33.749 } 00:26:33.749 ] 00:26:33.749 }' 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.749 13:45:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:34.007 "name": "raid_bdev1", 00:26:34.007 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:34.007 "strip_size_kb": 0, 00:26:34.007 "state": "online", 00:26:34.007 "raid_level": "raid1", 00:26:34.007 "superblock": true, 00:26:34.007 "num_base_bdevs": 2, 00:26:34.007 "num_base_bdevs_discovered": 2, 00:26:34.007 "num_base_bdevs_operational": 2, 00:26:34.007 
"base_bdevs_list": [ 00:26:34.007 { 00:26:34.007 "name": "spare", 00:26:34.007 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:34.007 "is_configured": true, 00:26:34.007 "data_offset": 256, 00:26:34.007 "data_size": 7936 00:26:34.007 }, 00:26:34.007 { 00:26:34.007 "name": "BaseBdev2", 00:26:34.007 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:34.007 "is_configured": true, 00:26:34.007 "data_offset": 256, 00:26:34.007 "data_size": 7936 00:26:34.007 } 00:26:34.007 ] 00:26:34.007 }' 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.007 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.008 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.266 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.266 "name": "raid_bdev1", 00:26:34.266 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:34.266 "strip_size_kb": 0, 00:26:34.266 "state": "online", 00:26:34.266 "raid_level": "raid1", 00:26:34.266 "superblock": true, 00:26:34.266 "num_base_bdevs": 2, 00:26:34.266 "num_base_bdevs_discovered": 2, 00:26:34.266 "num_base_bdevs_operational": 2, 00:26:34.266 "base_bdevs_list": [ 00:26:34.266 { 00:26:34.266 "name": "spare", 00:26:34.266 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:34.266 "is_configured": true, 00:26:34.266 "data_offset": 256, 00:26:34.266 "data_size": 7936 00:26:34.266 }, 00:26:34.266 { 00:26:34.266 "name": "BaseBdev2", 00:26:34.266 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:34.266 "is_configured": true, 00:26:34.266 "data_offset": 256, 00:26:34.266 "data_size": 7936 00:26:34.266 } 00:26:34.266 ] 00:26:34.266 }' 00:26:34.266 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.266 13:45:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:34.833 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:35.091 [2024-07-15 13:45:14.374478] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:35.091 [2024-07-15 13:45:14.374507] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:26:35.091 [2024-07-15 13:45:14.374567] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:35.091 [2024-07-15 13:45:14.374626] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:35.091 [2024-07-15 13:45:14.374638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b7070 name raid_bdev1, state offline 00:26:35.091 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.091 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:35.350 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:35.609 /dev/nbd0 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.609 1+0 records in 00:26:35.609 1+0 records out 00:26:35.609 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235452 s, 17.4 MB/s 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:35.609 13:45:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:35.867 /dev/nbd1 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:35.867 1+0 records in 00:26:35.867 1+0 records out 00:26:35.867 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305059 s, 13.4 MB/s 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:35.867 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:36.126 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.126 13:45:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:36.384 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:36.642 13:45:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:36.642 13:45:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:36.901 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:37.160 [2024-07-15 13:45:16.333963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:37.160 [2024-07-15 13:45:16.334004] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.160 [2024-07-15 13:45:16.334026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b58c0 00:26:37.160 [2024-07-15 13:45:16.334038] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.160 [2024-07-15 13:45:16.335648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.160 [2024-07-15 13:45:16.335674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:37.160 [2024-07-15 13:45:16.335756] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:37.160 [2024-07-15 13:45:16.335782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:37.160 [2024-07-15 13:45:16.335879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:37.160 spare 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:37.160 13:45:16 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.160 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.160 [2024-07-15 13:45:16.436203] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19b7490 00:26:37.160 [2024-07-15 13:45:16.436218] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:37.160 [2024-07-15 13:45:16.436404] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19aff50 00:26:37.160 [2024-07-15 13:45:16.436543] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19b7490 00:26:37.160 [2024-07-15 13:45:16.436553] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19b7490 00:26:37.160 [2024-07-15 13:45:16.436650] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:37.419 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.419 "name": "raid_bdev1", 00:26:37.419 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:37.419 "strip_size_kb": 0, 00:26:37.419 "state": "online", 00:26:37.419 "raid_level": "raid1", 00:26:37.419 "superblock": true, 00:26:37.419 "num_base_bdevs": 2, 00:26:37.419 "num_base_bdevs_discovered": 2, 00:26:37.419 "num_base_bdevs_operational": 2, 00:26:37.419 "base_bdevs_list": [ 00:26:37.419 { 00:26:37.419 "name": "spare", 00:26:37.419 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:37.419 "is_configured": true, 00:26:37.419 "data_offset": 256, 00:26:37.419 "data_size": 7936 00:26:37.419 }, 00:26:37.419 { 00:26:37.419 "name": "BaseBdev2", 00:26:37.419 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:37.419 "is_configured": true, 00:26:37.419 "data_offset": 256, 00:26:37.419 "data_size": 7936 00:26:37.419 } 00:26:37.419 ] 00:26:37.419 }' 00:26:37.419 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.419 13:45:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:37.985 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.243 "name": "raid_bdev1", 00:26:38.243 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:38.243 "strip_size_kb": 0, 00:26:38.243 "state": "online", 00:26:38.243 "raid_level": "raid1", 00:26:38.243 "superblock": true, 00:26:38.243 "num_base_bdevs": 2, 00:26:38.243 "num_base_bdevs_discovered": 2, 00:26:38.243 "num_base_bdevs_operational": 2, 00:26:38.243 "base_bdevs_list": [ 00:26:38.243 { 00:26:38.243 "name": "spare", 00:26:38.243 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:38.243 "is_configured": true, 00:26:38.243 "data_offset": 256, 00:26:38.243 "data_size": 7936 00:26:38.243 }, 00:26:38.243 { 00:26:38.243 "name": "BaseBdev2", 00:26:38.243 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:38.243 "is_configured": true, 00:26:38.243 "data_offset": 256, 00:26:38.243 "data_size": 7936 00:26:38.243 } 00:26:38.243 ] 00:26:38.243 }' 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.243 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:38.502 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare 
== \s\p\a\r\e ]] 00:26:38.502 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:38.760 [2024-07-15 13:45:17.970414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.760 13:45:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.018 13:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.018 "name": "raid_bdev1", 00:26:39.018 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 
00:26:39.018 "strip_size_kb": 0, 00:26:39.018 "state": "online", 00:26:39.018 "raid_level": "raid1", 00:26:39.018 "superblock": true, 00:26:39.018 "num_base_bdevs": 2, 00:26:39.018 "num_base_bdevs_discovered": 1, 00:26:39.018 "num_base_bdevs_operational": 1, 00:26:39.018 "base_bdevs_list": [ 00:26:39.018 { 00:26:39.018 "name": null, 00:26:39.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.018 "is_configured": false, 00:26:39.018 "data_offset": 256, 00:26:39.018 "data_size": 7936 00:26:39.018 }, 00:26:39.018 { 00:26:39.018 "name": "BaseBdev2", 00:26:39.018 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:39.018 "is_configured": true, 00:26:39.018 "data_offset": 256, 00:26:39.018 "data_size": 7936 00:26:39.018 } 00:26:39.018 ] 00:26:39.018 }' 00:26:39.018 13:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.018 13:45:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:39.584 13:45:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:40.149 [2024-07-15 13:45:19.309995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:40.149 [2024-07-15 13:45:19.310141] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:40.149 [2024-07-15 13:45:19.310157] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:40.149 [2024-07-15 13:45:19.310184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:40.149 [2024-07-15 13:45:19.314975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19aff50 00:26:40.149 [2024-07-15 13:45:19.317269] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:40.149 13:45:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.155 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.412 "name": "raid_bdev1", 00:26:41.412 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:41.412 "strip_size_kb": 0, 00:26:41.412 "state": "online", 00:26:41.412 "raid_level": "raid1", 00:26:41.412 "superblock": true, 00:26:41.412 "num_base_bdevs": 2, 00:26:41.412 "num_base_bdevs_discovered": 2, 00:26:41.412 "num_base_bdevs_operational": 2, 00:26:41.412 "process": { 00:26:41.412 "type": "rebuild", 00:26:41.412 "target": "spare", 00:26:41.412 "progress": { 00:26:41.412 "blocks": 3072, 
00:26:41.412 "percent": 38 00:26:41.412 } 00:26:41.412 }, 00:26:41.412 "base_bdevs_list": [ 00:26:41.412 { 00:26:41.412 "name": "spare", 00:26:41.412 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:41.412 "is_configured": true, 00:26:41.412 "data_offset": 256, 00:26:41.412 "data_size": 7936 00:26:41.412 }, 00:26:41.412 { 00:26:41.412 "name": "BaseBdev2", 00:26:41.412 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:41.412 "is_configured": true, 00:26:41.412 "data_offset": 256, 00:26:41.412 "data_size": 7936 00:26:41.412 } 00:26:41.412 ] 00:26:41.412 }' 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:41.412 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:41.670 [2024-07-15 13:45:20.915987] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:41.670 [2024-07-15 13:45:20.929906] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:41.670 [2024-07-15 13:45:20.929956] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:41.670 [2024-07-15 13:45:20.929972] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:41.670 [2024-07-15 13:45:20.929981] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.670 13:45:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.929 13:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:41.929 "name": "raid_bdev1", 00:26:41.929 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:41.929 "strip_size_kb": 0, 00:26:41.929 "state": "online", 00:26:41.929 "raid_level": "raid1", 00:26:41.929 "superblock": true, 00:26:41.929 "num_base_bdevs": 2, 00:26:41.929 "num_base_bdevs_discovered": 1, 00:26:41.929 "num_base_bdevs_operational": 1, 00:26:41.929 "base_bdevs_list": [ 00:26:41.929 { 00:26:41.929 "name": null, 00:26:41.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.929 "is_configured": false, 00:26:41.929 "data_offset": 
256, 00:26:41.929 "data_size": 7936 00:26:41.929 }, 00:26:41.929 { 00:26:41.929 "name": "BaseBdev2", 00:26:41.929 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:41.929 "is_configured": true, 00:26:41.929 "data_offset": 256, 00:26:41.929 "data_size": 7936 00:26:41.929 } 00:26:41.929 ] 00:26:41.929 }' 00:26:41.929 13:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:41.929 13:45:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:42.495 13:45:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:42.753 [2024-07-15 13:45:22.017256] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:42.753 [2024-07-15 13:45:22.017304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.753 [2024-07-15 13:45:22.017327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b6700 00:26:42.753 [2024-07-15 13:45:22.017341] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.753 [2024-07-15 13:45:22.017704] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.753 [2024-07-15 13:45:22.017727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:42.753 [2024-07-15 13:45:22.017805] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:42.753 [2024-07-15 13:45:22.017817] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:42.753 [2024-07-15 13:45:22.017828] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:42.753 [2024-07-15 13:45:22.017848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:42.753 [2024-07-15 13:45:22.022702] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19aff50 00:26:42.753 spare 00:26:42.754 [2024-07-15 13:45:22.024158] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:42.754 13:45:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.686 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.945 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:43.945 "name": "raid_bdev1", 00:26:43.945 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:43.945 "strip_size_kb": 0, 00:26:43.945 "state": "online", 00:26:43.945 "raid_level": "raid1", 00:26:43.945 "superblock": true, 00:26:43.945 "num_base_bdevs": 2, 00:26:43.945 "num_base_bdevs_discovered": 2, 00:26:43.945 "num_base_bdevs_operational": 2, 00:26:43.945 "process": { 00:26:43.945 "type": "rebuild", 00:26:43.945 "target": "spare", 00:26:43.945 "progress": { 00:26:43.945 
"blocks": 3072, 00:26:43.945 "percent": 38 00:26:43.945 } 00:26:43.945 }, 00:26:43.945 "base_bdevs_list": [ 00:26:43.945 { 00:26:43.945 "name": "spare", 00:26:43.945 "uuid": "b12c6363-54ab-5529-8bea-57b35377c0e3", 00:26:43.945 "is_configured": true, 00:26:43.945 "data_offset": 256, 00:26:43.945 "data_size": 7936 00:26:43.945 }, 00:26:43.945 { 00:26:43.945 "name": "BaseBdev2", 00:26:43.945 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:43.945 "is_configured": true, 00:26:43.945 "data_offset": 256, 00:26:43.945 "data_size": 7936 00:26:43.945 } 00:26:43.945 ] 00:26:43.945 }' 00:26:43.945 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:43.945 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:43.945 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.202 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.202 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:44.202 [2024-07-15 13:45:23.611230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:44.460 [2024-07-15 13:45:23.636579] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:44.460 [2024-07-15 13:45:23.636622] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:44.460 [2024-07-15 13:45:23.636638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:44.460 [2024-07-15 13:45:23.636647] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.460 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.717 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:44.717 "name": "raid_bdev1", 00:26:44.717 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:44.717 "strip_size_kb": 0, 00:26:44.717 "state": "online", 00:26:44.717 "raid_level": "raid1", 00:26:44.717 "superblock": true, 00:26:44.717 "num_base_bdevs": 2, 00:26:44.717 "num_base_bdevs_discovered": 1, 00:26:44.717 "num_base_bdevs_operational": 1, 00:26:44.717 "base_bdevs_list": [ 00:26:44.717 { 00:26:44.717 "name": null, 00:26:44.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.718 "is_configured": false, 00:26:44.718 
"data_offset": 256, 00:26:44.718 "data_size": 7936 00:26:44.718 }, 00:26:44.718 { 00:26:44.718 "name": "BaseBdev2", 00:26:44.718 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:44.718 "is_configured": true, 00:26:44.718 "data_offset": 256, 00:26:44.718 "data_size": 7936 00:26:44.718 } 00:26:44.718 ] 00:26:44.718 }' 00:26:44.718 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:44.718 13:45:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.317 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.575 "name": "raid_bdev1", 00:26:45.575 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:45.575 "strip_size_kb": 0, 00:26:45.575 "state": "online", 00:26:45.575 "raid_level": "raid1", 00:26:45.575 "superblock": true, 00:26:45.575 "num_base_bdevs": 2, 00:26:45.575 "num_base_bdevs_discovered": 1, 00:26:45.575 "num_base_bdevs_operational": 1, 00:26:45.575 "base_bdevs_list": [ 00:26:45.575 { 00:26:45.575 "name": null, 00:26:45.575 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:45.575 "is_configured": false, 00:26:45.575 "data_offset": 256, 00:26:45.575 "data_size": 7936 00:26:45.575 }, 00:26:45.575 { 00:26:45.575 "name": "BaseBdev2", 00:26:45.575 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:45.575 "is_configured": true, 00:26:45.575 "data_offset": 256, 00:26:45.575 "data_size": 7936 00:26:45.575 } 00:26:45.575 ] 00:26:45.575 }' 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:45.575 13:45:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:46.141 13:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:46.141 [2024-07-15 13:45:25.562135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:46.141 [2024-07-15 13:45:25.562182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:46.141 [2024-07-15 13:45:25.562203] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19b8650 00:26:46.141 [2024-07-15 13:45:25.562216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:46.141 [2024-07-15 13:45:25.562545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:46.141 [2024-07-15 13:45:25.562563] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:26:46.141 [2024-07-15 13:45:25.562624] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:46.141 [2024-07-15 13:45:25.562636] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:46.141 [2024-07-15 13:45:25.562647] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:46.141 BaseBdev1 00:26:46.399 13:45:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:47.333 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.333 13:45:26 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.591 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:47.591 "name": "raid_bdev1", 00:26:47.591 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:47.591 "strip_size_kb": 0, 00:26:47.591 "state": "online", 00:26:47.591 "raid_level": "raid1", 00:26:47.591 "superblock": true, 00:26:47.591 "num_base_bdevs": 2, 00:26:47.591 "num_base_bdevs_discovered": 1, 00:26:47.591 "num_base_bdevs_operational": 1, 00:26:47.591 "base_bdevs_list": [ 00:26:47.591 { 00:26:47.591 "name": null, 00:26:47.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.591 "is_configured": false, 00:26:47.591 "data_offset": 256, 00:26:47.591 "data_size": 7936 00:26:47.591 }, 00:26:47.591 { 00:26:47.591 "name": "BaseBdev2", 00:26:47.591 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:47.591 "is_configured": true, 00:26:47.591 "data_offset": 256, 00:26:47.591 "data_size": 7936 00:26:47.591 } 00:26:47.591 ] 00:26:47.591 }' 00:26:47.591 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:47.591 13:45:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.158 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.415 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.415 "name": "raid_bdev1", 00:26:48.415 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:48.415 "strip_size_kb": 0, 00:26:48.415 "state": "online", 00:26:48.415 "raid_level": "raid1", 00:26:48.415 "superblock": true, 00:26:48.415 "num_base_bdevs": 2, 00:26:48.415 "num_base_bdevs_discovered": 1, 00:26:48.415 "num_base_bdevs_operational": 1, 00:26:48.415 "base_bdevs_list": [ 00:26:48.416 { 00:26:48.416 "name": null, 00:26:48.416 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.416 "is_configured": false, 00:26:48.416 "data_offset": 256, 00:26:48.416 "data_size": 7936 00:26:48.416 }, 00:26:48.416 { 00:26:48.416 "name": "BaseBdev2", 00:26:48.416 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:48.416 "is_configured": true, 00:26:48.416 "data_offset": 256, 00:26:48.416 "data_size": 7936 00:26:48.416 } 00:26:48.416 ] 00:26:48.416 }' 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:48.416 13:45:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.674 [2024-07-15 13:45:27.996614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:48.674 [2024-07-15 13:45:27.996737] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:48.674 [2024-07-15 13:45:27.996753] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:26:48.674 request: 00:26:48.674 { 00:26:48.674 "base_bdev": "BaseBdev1", 00:26:48.674 "raid_bdev": "raid_bdev1", 00:26:48.674 "method": "bdev_raid_add_base_bdev", 00:26:48.674 "req_id": 1 00:26:48.674 } 00:26:48.674 Got JSON-RPC error response 00:26:48.674 response: 00:26:48.674 { 00:26:48.674 "code": -22, 00:26:48.674 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:48.674 } 00:26:48.674 13:45:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:26:48.674 13:45:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:48.674 13:45:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:48.674 13:45:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:48.674 13:45:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.607 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.864 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:49.864 "name": "raid_bdev1", 00:26:49.864 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:49.864 "strip_size_kb": 0, 00:26:49.864 "state": "online", 00:26:49.864 "raid_level": "raid1", 00:26:49.864 "superblock": true, 00:26:49.864 "num_base_bdevs": 2, 00:26:49.864 "num_base_bdevs_discovered": 1, 00:26:49.864 "num_base_bdevs_operational": 1, 00:26:49.864 "base_bdevs_list": [ 00:26:49.864 { 00:26:49.864 "name": null, 00:26:49.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:49.864 "is_configured": false, 00:26:49.864 "data_offset": 256, 00:26:49.864 "data_size": 7936 00:26:49.864 }, 00:26:49.864 { 00:26:49.864 "name": "BaseBdev2", 00:26:49.864 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:49.864 "is_configured": true, 00:26:49.864 "data_offset": 256, 00:26:49.864 "data_size": 7936 00:26:49.864 } 00:26:49.864 ] 00:26:49.864 }' 00:26:49.864 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:49.864 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:50.796 
13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.796 13:45:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.796 "name": "raid_bdev1", 00:26:50.796 "uuid": "55b42e00-d524-4313-bb9f-c02198dcad13", 00:26:50.796 "strip_size_kb": 0, 00:26:50.796 "state": "online", 00:26:50.796 "raid_level": "raid1", 00:26:50.796 "superblock": true, 00:26:50.796 "num_base_bdevs": 2, 00:26:50.796 "num_base_bdevs_discovered": 1, 00:26:50.796 "num_base_bdevs_operational": 1, 00:26:50.796 "base_bdevs_list": [ 00:26:50.796 { 00:26:50.796 "name": null, 00:26:50.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.796 "is_configured": false, 00:26:50.796 "data_offset": 256, 00:26:50.796 "data_size": 7936 00:26:50.796 }, 00:26:50.796 { 00:26:50.796 "name": "BaseBdev2", 00:26:50.796 "uuid": "c936c83b-9d91-5f50-b586-ce23c4db15cd", 00:26:50.796 "is_configured": true, 00:26:50.796 "data_offset": 256, 00:26:50.796 "data_size": 7936 00:26:50.796 } 00:26:50.796 ] 00:26:50.796 }' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:50.796 13:45:30 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2209948 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2209948 ']' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2209948 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2209948 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2209948' 00:26:50.796 killing process with pid 2209948 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2209948 00:26:50.796 Received shutdown signal, test time was about 60.000000 seconds 00:26:50.796 00:26:50.796 Latency(us) 00:26:50.796 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:50.796 =================================================================================================================== 00:26:50.796 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:50.796 [2024-07-15 13:45:30.176685] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:50.796 [2024-07-15 13:45:30.176774] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:50.796 [2024-07-15 13:45:30.176821] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:50.796 [2024-07-15 13:45:30.176835] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19b7490 name raid_bdev1, state offline 00:26:50.796 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2209948 00:26:50.796 [2024-07-15 13:45:30.203807] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:51.054 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:26:51.054 00:26:51.054 real 0m31.706s 00:26:51.054 user 0m49.348s 00:26:51.054 sys 0m5.301s 00:26:51.054 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:51.054 13:45:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:51.054 ************************************ 00:26:51.054 END TEST raid_rebuild_test_sb_4k 00:26:51.054 ************************************ 00:26:51.054 13:45:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:51.054 13:45:30 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:26:51.054 13:45:30 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:26:51.054 13:45:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:51.054 13:45:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:51.054 13:45:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:51.312 ************************************ 00:26:51.312 START TEST raid_state_function_test_sb_md_separate 00:26:51.312 ************************************ 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:51.312 
13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:51.312 13:45:30 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2214951 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2214951' 00:26:51.312 Process raid pid: 2214951 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2214951 /var/tmp/spdk-raid.sock 00:26:51.312 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2214951 ']' 00:26:51.313 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:51.313 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:51.313 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:51.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:51.313 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:51.313 13:45:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:51.313 [2024-07-15 13:45:30.575795] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:26:51.313 [2024-07-15 13:45:30.575860] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:51.313 [2024-07-15 13:45:30.705138] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:51.571 [2024-07-15 13:45:30.809133] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:51.571 [2024-07-15 13:45:30.877789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:51.571 [2024-07-15 13:45:30.877822] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.133 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:52.133 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:26:52.133 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:52.391 [2024-07-15 13:45:31.584568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:52.391 [2024-07-15 13:45:31.584613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:52.391 [2024-07-15 13:45:31.584624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:52.391 
[2024-07-15 13:45:31.584636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.391 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:52.648 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.648 "name": "Existed_Raid", 00:26:52.648 "uuid": 
"00a1e5b6-8827-4953-9770-6b3b3d2651dd", 00:26:52.648 "strip_size_kb": 0, 00:26:52.648 "state": "configuring", 00:26:52.648 "raid_level": "raid1", 00:26:52.648 "superblock": true, 00:26:52.648 "num_base_bdevs": 2, 00:26:52.648 "num_base_bdevs_discovered": 0, 00:26:52.648 "num_base_bdevs_operational": 2, 00:26:52.648 "base_bdevs_list": [ 00:26:52.648 { 00:26:52.648 "name": "BaseBdev1", 00:26:52.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.648 "is_configured": false, 00:26:52.648 "data_offset": 0, 00:26:52.648 "data_size": 0 00:26:52.648 }, 00:26:52.648 { 00:26:52.648 "name": "BaseBdev2", 00:26:52.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.648 "is_configured": false, 00:26:52.648 "data_offset": 0, 00:26:52.648 "data_size": 0 00:26:52.648 } 00:26:52.648 ] 00:26:52.648 }' 00:26:52.648 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.648 13:45:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:53.213 13:45:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:53.213 [2024-07-15 13:45:32.631190] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:53.213 [2024-07-15 13:45:32.631228] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd03a80 name Existed_Raid, state configuring 00:26:53.471 13:45:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:53.471 [2024-07-15 13:45:32.807685] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:53.471 [2024-07-15 13:45:32.807713] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:53.471 [2024-07-15 13:45:32.807723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:53.471 [2024-07-15 13:45:32.807734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:53.471 13:45:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:26:53.729 [2024-07-15 13:45:32.990709] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:53.729 BaseBdev1 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:53.729 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:53.988 [ 00:26:53.988 { 00:26:53.988 "name": 
"BaseBdev1", 00:26:53.988 "aliases": [ 00:26:53.988 "7d503cc4-414f-4315-a4ae-90595ed78897" 00:26:53.988 ], 00:26:53.988 "product_name": "Malloc disk", 00:26:53.988 "block_size": 4096, 00:26:53.988 "num_blocks": 8192, 00:26:53.988 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:53.988 "md_size": 32, 00:26:53.988 "md_interleave": false, 00:26:53.988 "dif_type": 0, 00:26:53.988 "assigned_rate_limits": { 00:26:53.988 "rw_ios_per_sec": 0, 00:26:53.988 "rw_mbytes_per_sec": 0, 00:26:53.988 "r_mbytes_per_sec": 0, 00:26:53.988 "w_mbytes_per_sec": 0 00:26:53.988 }, 00:26:53.988 "claimed": true, 00:26:53.988 "claim_type": "exclusive_write", 00:26:53.988 "zoned": false, 00:26:53.988 "supported_io_types": { 00:26:53.988 "read": true, 00:26:53.988 "write": true, 00:26:53.988 "unmap": true, 00:26:53.988 "flush": true, 00:26:53.988 "reset": true, 00:26:53.988 "nvme_admin": false, 00:26:53.988 "nvme_io": false, 00:26:53.988 "nvme_io_md": false, 00:26:53.988 "write_zeroes": true, 00:26:53.988 "zcopy": true, 00:26:53.988 "get_zone_info": false, 00:26:53.988 "zone_management": false, 00:26:53.988 "zone_append": false, 00:26:53.988 "compare": false, 00:26:53.988 "compare_and_write": false, 00:26:53.988 "abort": true, 00:26:53.988 "seek_hole": false, 00:26:53.988 "seek_data": false, 00:26:53.988 "copy": true, 00:26:53.988 "nvme_iov_md": false 00:26:53.988 }, 00:26:53.988 "memory_domains": [ 00:26:53.988 { 00:26:53.988 "dma_device_id": "system", 00:26:53.988 "dma_device_type": 1 00:26:53.988 }, 00:26:53.988 { 00:26:53.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:53.988 "dma_device_type": 2 00:26:53.988 } 00:26:53.988 ], 00:26:53.988 "driver_specific": {} 00:26:53.988 } 00:26:53.988 ] 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:53.988 
13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:53.988 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.246 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.246 "name": "Existed_Raid", 00:26:54.246 "uuid": "35ed5604-bb02-4964-9587-0b154271e8aa", 00:26:54.246 "strip_size_kb": 0, 00:26:54.246 "state": "configuring", 00:26:54.246 "raid_level": "raid1", 00:26:54.246 "superblock": true, 00:26:54.246 "num_base_bdevs": 2, 00:26:54.246 "num_base_bdevs_discovered": 1, 00:26:54.246 "num_base_bdevs_operational": 2, 00:26:54.246 
"base_bdevs_list": [ 00:26:54.246 { 00:26:54.246 "name": "BaseBdev1", 00:26:54.246 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:54.246 "is_configured": true, 00:26:54.246 "data_offset": 256, 00:26:54.246 "data_size": 7936 00:26:54.246 }, 00:26:54.246 { 00:26:54.246 "name": "BaseBdev2", 00:26:54.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.246 "is_configured": false, 00:26:54.246 "data_offset": 0, 00:26:54.246 "data_size": 0 00:26:54.246 } 00:26:54.246 ] 00:26:54.246 }' 00:26:54.246 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.246 13:45:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:54.810 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:55.068 [2024-07-15 13:45:34.334394] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:55.068 [2024-07-15 13:45:34.334437] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd03350 name Existed_Raid, state configuring 00:26:55.068 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:55.326 [2024-07-15 13:45:34.579083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:55.326 [2024-07-15 13:45:34.580509] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:55.326 [2024-07-15 13:45:34.580540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:55.326 
13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.326 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:55.591 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.591 "name": "Existed_Raid", 00:26:55.591 "uuid": 
"e182960e-c70e-49f4-a84e-0cb164ab99be", 00:26:55.591 "strip_size_kb": 0, 00:26:55.591 "state": "configuring", 00:26:55.591 "raid_level": "raid1", 00:26:55.591 "superblock": true, 00:26:55.591 "num_base_bdevs": 2, 00:26:55.591 "num_base_bdevs_discovered": 1, 00:26:55.591 "num_base_bdevs_operational": 2, 00:26:55.591 "base_bdevs_list": [ 00:26:55.591 { 00:26:55.591 "name": "BaseBdev1", 00:26:55.591 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:55.591 "is_configured": true, 00:26:55.591 "data_offset": 256, 00:26:55.591 "data_size": 7936 00:26:55.591 }, 00:26:55.591 { 00:26:55.591 "name": "BaseBdev2", 00:26:55.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.591 "is_configured": false, 00:26:55.591 "data_offset": 0, 00:26:55.591 "data_size": 0 00:26:55.591 } 00:26:55.591 ] 00:26:55.591 }' 00:26:55.591 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.591 13:45:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:56.200 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:26:56.459 [2024-07-15 13:45:35.694232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:56.459 [2024-07-15 13:45:35.694388] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd05210 00:26:56.459 [2024-07-15 13:45:35.694401] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:56.459 [2024-07-15 13:45:35.694466] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd04c50 00:26:56.459 [2024-07-15 13:45:35.694563] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd05210 00:26:56.459 [2024-07-15 13:45:35.694574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0xd05210 00:26:56.459 [2024-07-15 13:45:35.694643] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.459 BaseBdev2 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:56.459 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:56.716 13:45:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:56.973 [ 00:26:56.973 { 00:26:56.973 "name": "BaseBdev2", 00:26:56.973 "aliases": [ 00:26:56.973 "82202472-6c80-444e-9f76-3e4da191814f" 00:26:56.973 ], 00:26:56.973 "product_name": "Malloc disk", 00:26:56.973 "block_size": 4096, 00:26:56.973 "num_blocks": 8192, 00:26:56.973 "uuid": "82202472-6c80-444e-9f76-3e4da191814f", 00:26:56.973 "md_size": 32, 00:26:56.973 "md_interleave": false, 00:26:56.973 "dif_type": 0, 00:26:56.973 "assigned_rate_limits": { 00:26:56.973 "rw_ios_per_sec": 0, 00:26:56.973 "rw_mbytes_per_sec": 0, 00:26:56.974 "r_mbytes_per_sec": 0, 00:26:56.974 
"w_mbytes_per_sec": 0 00:26:56.974 }, 00:26:56.974 "claimed": true, 00:26:56.974 "claim_type": "exclusive_write", 00:26:56.974 "zoned": false, 00:26:56.974 "supported_io_types": { 00:26:56.974 "read": true, 00:26:56.974 "write": true, 00:26:56.974 "unmap": true, 00:26:56.974 "flush": true, 00:26:56.974 "reset": true, 00:26:56.974 "nvme_admin": false, 00:26:56.974 "nvme_io": false, 00:26:56.974 "nvme_io_md": false, 00:26:56.974 "write_zeroes": true, 00:26:56.974 "zcopy": true, 00:26:56.974 "get_zone_info": false, 00:26:56.974 "zone_management": false, 00:26:56.974 "zone_append": false, 00:26:56.974 "compare": false, 00:26:56.974 "compare_and_write": false, 00:26:56.974 "abort": true, 00:26:56.974 "seek_hole": false, 00:26:56.974 "seek_data": false, 00:26:56.974 "copy": true, 00:26:56.974 "nvme_iov_md": false 00:26:56.974 }, 00:26:56.974 "memory_domains": [ 00:26:56.974 { 00:26:56.974 "dma_device_id": "system", 00:26:56.974 "dma_device_type": 1 00:26:56.974 }, 00:26:56.974 { 00:26:56.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.974 "dma_device_type": 2 00:26:56.974 } 00:26:56.974 ], 00:26:56.974 "driver_specific": {} 00:26:56.974 } 00:26:56.974 ] 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.974 13:45:36 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.974 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:57.231 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.231 "name": "Existed_Raid", 00:26:57.231 "uuid": "e182960e-c70e-49f4-a84e-0cb164ab99be", 00:26:57.231 "strip_size_kb": 0, 00:26:57.231 "state": "online", 00:26:57.231 "raid_level": "raid1", 00:26:57.231 "superblock": true, 00:26:57.231 "num_base_bdevs": 2, 00:26:57.231 "num_base_bdevs_discovered": 2, 00:26:57.231 "num_base_bdevs_operational": 2, 00:26:57.231 "base_bdevs_list": [ 00:26:57.231 { 00:26:57.231 "name": "BaseBdev1", 00:26:57.231 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:57.231 "is_configured": true, 00:26:57.231 "data_offset": 256, 00:26:57.231 "data_size": 7936 00:26:57.231 }, 00:26:57.231 { 00:26:57.231 "name": 
"BaseBdev2", 00:26:57.231 "uuid": "82202472-6c80-444e-9f76-3e4da191814f", 00:26:57.231 "is_configured": true, 00:26:57.231 "data_offset": 256, 00:26:57.231 "data_size": 7936 00:26:57.231 } 00:26:57.231 ] 00:26:57.231 }' 00:26:57.231 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.231 13:45:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:57.844 [2024-07-15 13:45:37.158391] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:57.844 "name": "Existed_Raid", 00:26:57.844 "aliases": [ 00:26:57.844 "e182960e-c70e-49f4-a84e-0cb164ab99be" 00:26:57.844 ], 00:26:57.844 "product_name": "Raid Volume", 00:26:57.844 "block_size": 4096, 
00:26:57.844 "num_blocks": 7936, 00:26:57.844 "uuid": "e182960e-c70e-49f4-a84e-0cb164ab99be", 00:26:57.844 "md_size": 32, 00:26:57.844 "md_interleave": false, 00:26:57.844 "dif_type": 0, 00:26:57.844 "assigned_rate_limits": { 00:26:57.844 "rw_ios_per_sec": 0, 00:26:57.844 "rw_mbytes_per_sec": 0, 00:26:57.844 "r_mbytes_per_sec": 0, 00:26:57.844 "w_mbytes_per_sec": 0 00:26:57.844 }, 00:26:57.844 "claimed": false, 00:26:57.844 "zoned": false, 00:26:57.844 "supported_io_types": { 00:26:57.844 "read": true, 00:26:57.844 "write": true, 00:26:57.844 "unmap": false, 00:26:57.844 "flush": false, 00:26:57.844 "reset": true, 00:26:57.844 "nvme_admin": false, 00:26:57.844 "nvme_io": false, 00:26:57.844 "nvme_io_md": false, 00:26:57.844 "write_zeroes": true, 00:26:57.844 "zcopy": false, 00:26:57.844 "get_zone_info": false, 00:26:57.844 "zone_management": false, 00:26:57.844 "zone_append": false, 00:26:57.844 "compare": false, 00:26:57.844 "compare_and_write": false, 00:26:57.844 "abort": false, 00:26:57.844 "seek_hole": false, 00:26:57.844 "seek_data": false, 00:26:57.844 "copy": false, 00:26:57.844 "nvme_iov_md": false 00:26:57.844 }, 00:26:57.844 "memory_domains": [ 00:26:57.844 { 00:26:57.844 "dma_device_id": "system", 00:26:57.844 "dma_device_type": 1 00:26:57.844 }, 00:26:57.844 { 00:26:57.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.844 "dma_device_type": 2 00:26:57.844 }, 00:26:57.844 { 00:26:57.844 "dma_device_id": "system", 00:26:57.844 "dma_device_type": 1 00:26:57.844 }, 00:26:57.844 { 00:26:57.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.844 "dma_device_type": 2 00:26:57.844 } 00:26:57.844 ], 00:26:57.844 "driver_specific": { 00:26:57.844 "raid": { 00:26:57.844 "uuid": "e182960e-c70e-49f4-a84e-0cb164ab99be", 00:26:57.844 "strip_size_kb": 0, 00:26:57.844 "state": "online", 00:26:57.844 "raid_level": "raid1", 00:26:57.844 "superblock": true, 00:26:57.844 "num_base_bdevs": 2, 00:26:57.844 "num_base_bdevs_discovered": 2, 00:26:57.844 
"num_base_bdevs_operational": 2, 00:26:57.844 "base_bdevs_list": [ 00:26:57.844 { 00:26:57.844 "name": "BaseBdev1", 00:26:57.844 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:57.844 "is_configured": true, 00:26:57.844 "data_offset": 256, 00:26:57.844 "data_size": 7936 00:26:57.844 }, 00:26:57.844 { 00:26:57.844 "name": "BaseBdev2", 00:26:57.844 "uuid": "82202472-6c80-444e-9f76-3e4da191814f", 00:26:57.844 "is_configured": true, 00:26:57.844 "data_offset": 256, 00:26:57.844 "data_size": 7936 00:26:57.844 } 00:26:57.844 ] 00:26:57.844 } 00:26:57.844 } 00:26:57.844 }' 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:57.844 BaseBdev2' 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:57.844 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:58.408 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:58.408 "name": "BaseBdev1", 00:26:58.408 "aliases": [ 00:26:58.408 "7d503cc4-414f-4315-a4ae-90595ed78897" 00:26:58.408 ], 00:26:58.408 "product_name": "Malloc disk", 00:26:58.408 "block_size": 4096, 00:26:58.408 "num_blocks": 8192, 00:26:58.408 "uuid": "7d503cc4-414f-4315-a4ae-90595ed78897", 00:26:58.408 "md_size": 32, 00:26:58.408 "md_interleave": false, 00:26:58.408 "dif_type": 0, 00:26:58.408 "assigned_rate_limits": { 00:26:58.408 "rw_ios_per_sec": 0, 00:26:58.408 
"rw_mbytes_per_sec": 0, 00:26:58.408 "r_mbytes_per_sec": 0, 00:26:58.408 "w_mbytes_per_sec": 0 00:26:58.408 }, 00:26:58.408 "claimed": true, 00:26:58.408 "claim_type": "exclusive_write", 00:26:58.408 "zoned": false, 00:26:58.408 "supported_io_types": { 00:26:58.408 "read": true, 00:26:58.408 "write": true, 00:26:58.408 "unmap": true, 00:26:58.408 "flush": true, 00:26:58.408 "reset": true, 00:26:58.408 "nvme_admin": false, 00:26:58.408 "nvme_io": false, 00:26:58.408 "nvme_io_md": false, 00:26:58.408 "write_zeroes": true, 00:26:58.408 "zcopy": true, 00:26:58.408 "get_zone_info": false, 00:26:58.408 "zone_management": false, 00:26:58.408 "zone_append": false, 00:26:58.408 "compare": false, 00:26:58.408 "compare_and_write": false, 00:26:58.408 "abort": true, 00:26:58.408 "seek_hole": false, 00:26:58.408 "seek_data": false, 00:26:58.408 "copy": true, 00:26:58.408 "nvme_iov_md": false 00:26:58.408 }, 00:26:58.408 "memory_domains": [ 00:26:58.408 { 00:26:58.408 "dma_device_id": "system", 00:26:58.408 "dma_device_type": 1 00:26:58.408 }, 00:26:58.408 { 00:26:58.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.408 "dma_device_type": 2 00:26:58.408 } 00:26:58.408 ], 00:26:58.408 "driver_specific": {} 00:26:58.408 }' 00:26:58.408 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.408 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.664 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:58.664 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.664 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.664 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:58.664 13:45:37 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.664 13:45:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.664 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:58.664 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.664 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.921 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:58.921 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:58.921 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:58.921 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:59.177 "name": "BaseBdev2", 00:26:59.177 "aliases": [ 00:26:59.177 "82202472-6c80-444e-9f76-3e4da191814f" 00:26:59.177 ], 00:26:59.177 "product_name": "Malloc disk", 00:26:59.177 "block_size": 4096, 00:26:59.177 "num_blocks": 8192, 00:26:59.177 "uuid": "82202472-6c80-444e-9f76-3e4da191814f", 00:26:59.177 "md_size": 32, 00:26:59.177 "md_interleave": false, 00:26:59.177 "dif_type": 0, 00:26:59.177 "assigned_rate_limits": { 00:26:59.177 "rw_ios_per_sec": 0, 00:26:59.177 "rw_mbytes_per_sec": 0, 00:26:59.177 "r_mbytes_per_sec": 0, 00:26:59.177 "w_mbytes_per_sec": 0 00:26:59.177 }, 00:26:59.177 "claimed": true, 00:26:59.177 "claim_type": "exclusive_write", 00:26:59.177 "zoned": false, 00:26:59.177 "supported_io_types": { 
00:26:59.177 "read": true, 00:26:59.177 "write": true, 00:26:59.177 "unmap": true, 00:26:59.177 "flush": true, 00:26:59.177 "reset": true, 00:26:59.177 "nvme_admin": false, 00:26:59.177 "nvme_io": false, 00:26:59.177 "nvme_io_md": false, 00:26:59.177 "write_zeroes": true, 00:26:59.177 "zcopy": true, 00:26:59.177 "get_zone_info": false, 00:26:59.177 "zone_management": false, 00:26:59.177 "zone_append": false, 00:26:59.177 "compare": false, 00:26:59.177 "compare_and_write": false, 00:26:59.177 "abort": true, 00:26:59.177 "seek_hole": false, 00:26:59.177 "seek_data": false, 00:26:59.177 "copy": true, 00:26:59.177 "nvme_iov_md": false 00:26:59.177 }, 00:26:59.177 "memory_domains": [ 00:26:59.177 { 00:26:59.177 "dma_device_id": "system", 00:26:59.177 "dma_device_type": 1 00:26:59.177 }, 00:26:59.177 { 00:26:59.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:59.177 "dma_device_type": 2 00:26:59.177 } 00:26:59.177 ], 00:26:59.177 "driver_specific": {} 00:26:59.177 }' 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.177 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.434 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:59.434 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.434 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.434 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:59.434 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:59.692 [2024-07-15 13:45:38.930888] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:59.692 
13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.692 13:45:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:59.949 13:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:59.949 "name": "Existed_Raid", 00:26:59.949 "uuid": "e182960e-c70e-49f4-a84e-0cb164ab99be", 00:26:59.949 "strip_size_kb": 0, 00:26:59.949 "state": "online", 00:26:59.949 "raid_level": "raid1", 00:26:59.949 "superblock": true, 00:26:59.949 "num_base_bdevs": 2, 00:26:59.949 "num_base_bdevs_discovered": 1, 00:26:59.949 "num_base_bdevs_operational": 1, 00:26:59.949 "base_bdevs_list": [ 00:26:59.949 { 00:26:59.949 "name": null, 00:26:59.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:59.949 "is_configured": false, 00:26:59.949 "data_offset": 256, 00:26:59.949 "data_size": 7936 00:26:59.949 }, 00:26:59.949 { 00:26:59.949 "name": "BaseBdev2", 00:26:59.949 "uuid": "82202472-6c80-444e-9f76-3e4da191814f", 00:26:59.949 "is_configured": true, 00:26:59.949 "data_offset": 256, 00:26:59.949 "data_size": 7936 00:26:59.949 } 00:26:59.949 ] 00:26:59.949 }' 00:26:59.949 13:45:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:59.949 13:45:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:00.880 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:01.137 [2024-07-15 13:45:40.511715] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:01.137 [2024-07-15 13:45:40.511814] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:01.137 [2024-07-15 13:45:40.525322] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:01.137 [2024-07-15 13:45:40.525361] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:01.137 [2024-07-15 13:45:40.525374] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd05210 name Existed_Raid, state offline 00:27:01.137 13:45:40 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:01.137 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:01.137 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.137 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2214951 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2214951 ']' 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2214951 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2214951 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:01.395 13:45:40 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2214951' 00:27:01.395 killing process with pid 2214951 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2214951 00:27:01.395 [2024-07-15 13:45:40.763558] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:01.395 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2214951 00:27:01.395 [2024-07-15 13:45:40.764562] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:01.652 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:27:01.652 00:27:01.652 real 0m10.485s 00:27:01.652 user 0m18.564s 00:27:01.652 sys 0m1.992s 00:27:01.653 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:01.653 13:45:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:01.653 ************************************ 00:27:01.653 END TEST raid_state_function_test_sb_md_separate 00:27:01.653 ************************************ 00:27:01.653 13:45:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:01.653 13:45:41 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:27:01.653 13:45:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:01.653 13:45:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:01.653 13:45:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:01.653 ************************************ 00:27:01.653 START TEST raid_superblock_test_md_separate 00:27:01.653 ************************************ 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2216574 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2216574 
/var/tmp/spdk-raid.sock 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2216574 ']' 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:01.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:01.653 13:45:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:01.942 [2024-07-15 13:45:41.122978] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:27:01.942 [2024-07-15 13:45:41.123046] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2216574 ] 00:27:01.942 [2024-07-15 13:45:41.244719] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:01.942 [2024-07-15 13:45:41.347608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:02.199 [2024-07-15 13:45:41.415003] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.199 [2024-07-15 13:45:41.415039] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:02.765 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:02.765 13:45:42 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:27:03.023 malloc1 00:27:03.023 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:03.590 [2024-07-15 13:45:42.772508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:03.590 [2024-07-15 13:45:42.772558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.590 [2024-07-15 13:45:42.772580] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f9830 00:27:03.590 [2024-07-15 13:45:42.772592] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.590 [2024-07-15 13:45:42.774191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.590 [2024-07-15 13:45:42.774218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:03.590 pt1 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:03.590 13:45:42 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:03.590 13:45:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:27:03.848 malloc2 00:27:03.848 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:03.848 [2024-07-15 13:45:43.272736] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:03.848 [2024-07-15 13:45:43.272785] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.848 [2024-07-15 13:45:43.272805] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24eb250 00:27:03.848 [2024-07-15 13:45:43.272818] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.105 [2024-07-15 13:45:43.274293] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:04.105 [2024-07-15 13:45:43.274322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:04.106 pt2 00:27:04.106 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:04.106 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:04.106 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:04.377 [2024-07-15 13:45:43.774083] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:04.377 [2024-07-15 13:45:43.775748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:04.377 [2024-07-15 13:45:43.775915] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24ebd20 00:27:04.377 [2024-07-15 13:45:43.775936] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:04.377 [2024-07-15 13:45:43.776028] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24dfa60 00:27:04.377 [2024-07-15 13:45:43.776164] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24ebd20 00:27:04.377 [2024-07-15 13:45:43.776174] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24ebd20 00:27:04.377 [2024-07-15 13:45:43.776262] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.633 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:04.634 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.634 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.634 13:45:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.634 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.634 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.634 13:45:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.890 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.890 "name": "raid_bdev1", 00:27:04.890 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:04.890 "strip_size_kb": 0, 00:27:04.890 "state": "online", 00:27:04.890 "raid_level": "raid1", 00:27:04.890 "superblock": true, 00:27:04.890 "num_base_bdevs": 2, 00:27:04.890 "num_base_bdevs_discovered": 2, 00:27:04.890 "num_base_bdevs_operational": 2, 00:27:04.890 "base_bdevs_list": [ 00:27:04.890 { 00:27:04.890 "name": "pt1", 00:27:04.890 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:04.890 "is_configured": true, 00:27:04.890 "data_offset": 256, 00:27:04.890 "data_size": 7936 00:27:04.890 }, 00:27:04.890 { 00:27:04.890 "name": "pt2", 00:27:04.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:04.890 "is_configured": true, 00:27:04.890 "data_offset": 256, 00:27:04.890 "data_size": 7936 00:27:04.890 } 00:27:04.890 ] 00:27:04.890 }' 00:27:04.890 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.890 13:45:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:05.456 13:45:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:06.019 [2024-07-15 13:45:45.145959] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:06.019 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:06.019 "name": "raid_bdev1", 00:27:06.019 "aliases": [ 00:27:06.019 "30cd1936-54e7-4962-a517-61cde7f78077" 00:27:06.019 ], 00:27:06.019 "product_name": "Raid Volume", 00:27:06.019 "block_size": 4096, 00:27:06.019 "num_blocks": 7936, 00:27:06.019 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:06.019 "md_size": 32, 00:27:06.019 "md_interleave": false, 00:27:06.019 "dif_type": 0, 00:27:06.019 "assigned_rate_limits": { 00:27:06.019 "rw_ios_per_sec": 0, 00:27:06.019 "rw_mbytes_per_sec": 0, 00:27:06.019 "r_mbytes_per_sec": 0, 00:27:06.019 "w_mbytes_per_sec": 0 00:27:06.019 }, 00:27:06.019 "claimed": false, 00:27:06.019 "zoned": false, 00:27:06.019 "supported_io_types": { 00:27:06.019 "read": true, 00:27:06.019 "write": true, 00:27:06.019 "unmap": false, 00:27:06.019 "flush": false, 00:27:06.019 "reset": true, 00:27:06.019 "nvme_admin": false, 00:27:06.019 "nvme_io": false, 00:27:06.019 "nvme_io_md": false, 00:27:06.019 "write_zeroes": true, 
00:27:06.019 "zcopy": false, 00:27:06.019 "get_zone_info": false, 00:27:06.019 "zone_management": false, 00:27:06.019 "zone_append": false, 00:27:06.019 "compare": false, 00:27:06.019 "compare_and_write": false, 00:27:06.019 "abort": false, 00:27:06.019 "seek_hole": false, 00:27:06.019 "seek_data": false, 00:27:06.019 "copy": false, 00:27:06.019 "nvme_iov_md": false 00:27:06.019 }, 00:27:06.019 "memory_domains": [ 00:27:06.019 { 00:27:06.019 "dma_device_id": "system", 00:27:06.019 "dma_device_type": 1 00:27:06.019 }, 00:27:06.019 { 00:27:06.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.019 "dma_device_type": 2 00:27:06.019 }, 00:27:06.019 { 00:27:06.019 "dma_device_id": "system", 00:27:06.019 "dma_device_type": 1 00:27:06.019 }, 00:27:06.019 { 00:27:06.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.019 "dma_device_type": 2 00:27:06.019 } 00:27:06.019 ], 00:27:06.019 "driver_specific": { 00:27:06.019 "raid": { 00:27:06.019 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:06.019 "strip_size_kb": 0, 00:27:06.019 "state": "online", 00:27:06.019 "raid_level": "raid1", 00:27:06.019 "superblock": true, 00:27:06.019 "num_base_bdevs": 2, 00:27:06.019 "num_base_bdevs_discovered": 2, 00:27:06.019 "num_base_bdevs_operational": 2, 00:27:06.019 "base_bdevs_list": [ 00:27:06.019 { 00:27:06.019 "name": "pt1", 00:27:06.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:06.019 "is_configured": true, 00:27:06.019 "data_offset": 256, 00:27:06.019 "data_size": 7936 00:27:06.019 }, 00:27:06.019 { 00:27:06.019 "name": "pt2", 00:27:06.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:06.019 "is_configured": true, 00:27:06.019 "data_offset": 256, 00:27:06.019 "data_size": 7936 00:27:06.019 } 00:27:06.019 ] 00:27:06.019 } 00:27:06.019 } 00:27:06.019 }' 00:27:06.019 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:06.019 13:45:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:06.019 pt2' 00:27:06.019 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:06.019 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:06.019 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:06.275 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:06.276 "name": "pt1", 00:27:06.276 "aliases": [ 00:27:06.276 "00000000-0000-0000-0000-000000000001" 00:27:06.276 ], 00:27:06.276 "product_name": "passthru", 00:27:06.276 "block_size": 4096, 00:27:06.276 "num_blocks": 8192, 00:27:06.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:06.276 "md_size": 32, 00:27:06.276 "md_interleave": false, 00:27:06.276 "dif_type": 0, 00:27:06.276 "assigned_rate_limits": { 00:27:06.276 "rw_ios_per_sec": 0, 00:27:06.276 "rw_mbytes_per_sec": 0, 00:27:06.276 "r_mbytes_per_sec": 0, 00:27:06.276 "w_mbytes_per_sec": 0 00:27:06.276 }, 00:27:06.276 "claimed": true, 00:27:06.276 "claim_type": "exclusive_write", 00:27:06.276 "zoned": false, 00:27:06.276 "supported_io_types": { 00:27:06.276 "read": true, 00:27:06.276 "write": true, 00:27:06.276 "unmap": true, 00:27:06.276 "flush": true, 00:27:06.276 "reset": true, 00:27:06.276 "nvme_admin": false, 00:27:06.276 "nvme_io": false, 00:27:06.276 "nvme_io_md": false, 00:27:06.276 "write_zeroes": true, 00:27:06.276 "zcopy": true, 00:27:06.276 "get_zone_info": false, 00:27:06.276 "zone_management": false, 00:27:06.276 "zone_append": false, 00:27:06.276 "compare": false, 00:27:06.276 "compare_and_write": false, 00:27:06.276 "abort": true, 00:27:06.276 "seek_hole": false, 00:27:06.276 "seek_data": false, 00:27:06.276 "copy": true, 00:27:06.276 
"nvme_iov_md": false 00:27:06.276 }, 00:27:06.276 "memory_domains": [ 00:27:06.276 { 00:27:06.276 "dma_device_id": "system", 00:27:06.276 "dma_device_type": 1 00:27:06.276 }, 00:27:06.276 { 00:27:06.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.276 "dma_device_type": 2 00:27:06.276 } 00:27:06.276 ], 00:27:06.276 "driver_specific": { 00:27:06.276 "passthru": { 00:27:06.276 "name": "pt1", 00:27:06.276 "base_bdev_name": "malloc1" 00:27:06.276 } 00:27:06.276 } 00:27:06.276 }' 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.276 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:06.533 13:45:45 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:06.534 13:45:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:06.792 "name": "pt2", 00:27:06.792 "aliases": [ 00:27:06.792 "00000000-0000-0000-0000-000000000002" 00:27:06.792 ], 00:27:06.792 "product_name": "passthru", 00:27:06.792 "block_size": 4096, 00:27:06.792 "num_blocks": 8192, 00:27:06.792 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:06.792 "md_size": 32, 00:27:06.792 "md_interleave": false, 00:27:06.792 "dif_type": 0, 00:27:06.792 "assigned_rate_limits": { 00:27:06.792 "rw_ios_per_sec": 0, 00:27:06.792 "rw_mbytes_per_sec": 0, 00:27:06.792 "r_mbytes_per_sec": 0, 00:27:06.792 "w_mbytes_per_sec": 0 00:27:06.792 }, 00:27:06.792 "claimed": true, 00:27:06.792 "claim_type": "exclusive_write", 00:27:06.792 "zoned": false, 00:27:06.792 "supported_io_types": { 00:27:06.792 "read": true, 00:27:06.792 "write": true, 00:27:06.792 "unmap": true, 00:27:06.792 "flush": true, 00:27:06.792 "reset": true, 00:27:06.792 "nvme_admin": false, 00:27:06.792 "nvme_io": false, 00:27:06.792 "nvme_io_md": false, 00:27:06.792 "write_zeroes": true, 00:27:06.792 "zcopy": true, 00:27:06.792 "get_zone_info": false, 00:27:06.792 "zone_management": false, 00:27:06.792 "zone_append": false, 00:27:06.792 "compare": false, 00:27:06.792 "compare_and_write": false, 00:27:06.792 "abort": true, 00:27:06.792 "seek_hole": false, 00:27:06.792 "seek_data": false, 00:27:06.792 "copy": true, 00:27:06.792 "nvme_iov_md": false 00:27:06.792 }, 00:27:06.792 "memory_domains": [ 00:27:06.792 { 00:27:06.792 "dma_device_id": "system", 00:27:06.792 "dma_device_type": 1 00:27:06.792 }, 00:27:06.792 { 00:27:06.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.792 "dma_device_type": 2 00:27:06.792 } 
00:27:06.792 ], 00:27:06.792 "driver_specific": { 00:27:06.792 "passthru": { 00:27:06.792 "name": "pt2", 00:27:06.792 "base_bdev_name": "malloc2" 00:27:06.792 } 00:27:06.792 } 00:27:06.792 }' 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.792 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:07.048 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:07.323 [2024-07-15 13:45:46.637877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:07.323 13:45:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=30cd1936-54e7-4962-a517-61cde7f78077 00:27:07.323 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 30cd1936-54e7-4962-a517-61cde7f78077 ']' 00:27:07.323 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:07.581 [2024-07-15 13:45:46.894300] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:07.581 [2024-07-15 13:45:46.894321] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:07.581 [2024-07-15 13:45:46.894382] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:07.581 [2024-07-15 13:45:46.894440] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:07.581 [2024-07-15 13:45:46.894452] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24ebd20 name raid_bdev1, state offline 00:27:07.581 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.581 13:45:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:07.837 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:07.837 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:07.837 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.837 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:27:08.094 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:08.094 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:08.351 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:08.351 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:08.608 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.609 13:45:47 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:08.609 13:45:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:08.868 [2024-07-15 13:45:48.105459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:08.868 [2024-07-15 13:45:48.107172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:08.868 [2024-07-15 13:45:48.107239] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:08.868 [2024-07-15 13:45:48.107285] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:08.868 [2024-07-15 13:45:48.107304] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:08.868 [2024-07-15 13:45:48.107314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x235bed0 name raid_bdev1, state configuring 00:27:08.868 request: 00:27:08.868 { 00:27:08.868 "name": "raid_bdev1", 00:27:08.868 "raid_level": "raid1", 00:27:08.868 "base_bdevs": [ 
00:27:08.868 "malloc1", 00:27:08.868 "malloc2" 00:27:08.868 ], 00:27:08.868 "superblock": false, 00:27:08.868 "method": "bdev_raid_create", 00:27:08.868 "req_id": 1 00:27:08.868 } 00:27:08.868 Got JSON-RPC error response 00:27:08.868 response: 00:27:08.868 { 00:27:08.868 "code": -17, 00:27:08.868 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:08.868 } 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.868 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:09.125 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:09.125 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:09.125 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:09.383 [2024-07-15 13:45:48.614747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:09.383 [2024-07-15 13:45:48.614804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:09.383 [2024-07-15 13:45:48.614823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f9ee0 
00:27:09.383 [2024-07-15 13:45:48.614835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:09.383 [2024-07-15 13:45:48.616603] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:09.383 [2024-07-15 13:45:48.616636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:09.383 [2024-07-15 13:45:48.616690] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:09.383 [2024-07-15 13:45:48.616718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:09.383 pt1 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.383 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.384 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.384 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.683 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.683 "name": "raid_bdev1", 00:27:09.683 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:09.683 "strip_size_kb": 0, 00:27:09.683 "state": "configuring", 00:27:09.683 "raid_level": "raid1", 00:27:09.683 "superblock": true, 00:27:09.683 "num_base_bdevs": 2, 00:27:09.683 "num_base_bdevs_discovered": 1, 00:27:09.683 "num_base_bdevs_operational": 2, 00:27:09.683 "base_bdevs_list": [ 00:27:09.683 { 00:27:09.683 "name": "pt1", 00:27:09.683 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:09.683 "is_configured": true, 00:27:09.683 "data_offset": 256, 00:27:09.683 "data_size": 7936 00:27:09.683 }, 00:27:09.683 { 00:27:09.683 "name": null, 00:27:09.683 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:09.683 "is_configured": false, 00:27:09.683 "data_offset": 256, 00:27:09.683 "data_size": 7936 00:27:09.683 } 00:27:09.683 ] 00:27:09.683 }' 00:27:09.683 13:45:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.683 13:45:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:10.306 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:10.306 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:10.306 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:10.306 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:10.306 [2024-07-15 13:45:49.713791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:10.306 [2024-07-15 13:45:49.713842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:10.306 [2024-07-15 13:45:49.713861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235c490 00:27:10.306 [2024-07-15 13:45:49.713874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:10.306 [2024-07-15 13:45:49.714101] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:10.306 [2024-07-15 13:45:49.714121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:10.306 [2024-07-15 13:45:49.714170] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:10.306 [2024-07-15 13:45:49.714191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:10.306 [2024-07-15 13:45:49.714289] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e05d0 00:27:10.306 [2024-07-15 13:45:49.714301] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:10.306 [2024-07-15 13:45:49.714363] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e1800 00:27:10.306 [2024-07-15 13:45:49.714471] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e05d0 00:27:10.306 [2024-07-15 13:45:49.714482] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24e05d0 00:27:10.306 [2024-07-15 13:45:49.714556] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:10.306 pt2 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.563 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.563 "name": "raid_bdev1", 00:27:10.563 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:10.563 "strip_size_kb": 0, 00:27:10.564 "state": "online", 00:27:10.564 "raid_level": "raid1", 00:27:10.564 "superblock": true, 00:27:10.564 "num_base_bdevs": 2, 00:27:10.564 
"num_base_bdevs_discovered": 2, 00:27:10.564 "num_base_bdevs_operational": 2, 00:27:10.564 "base_bdevs_list": [ 00:27:10.564 { 00:27:10.564 "name": "pt1", 00:27:10.564 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:10.564 "is_configured": true, 00:27:10.564 "data_offset": 256, 00:27:10.564 "data_size": 7936 00:27:10.564 }, 00:27:10.564 { 00:27:10.564 "name": "pt2", 00:27:10.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:10.564 "is_configured": true, 00:27:10.564 "data_offset": 256, 00:27:10.564 "data_size": 7936 00:27:10.564 } 00:27:10.564 ] 00:27:10.564 }' 00:27:10.564 13:45:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.564 13:45:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:11.128 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:11.384 [2024-07-15 13:45:50.748809] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:11.384 13:45:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:11.384 "name": "raid_bdev1", 00:27:11.384 "aliases": [ 00:27:11.384 "30cd1936-54e7-4962-a517-61cde7f78077" 00:27:11.384 ], 00:27:11.384 "product_name": "Raid Volume", 00:27:11.384 "block_size": 4096, 00:27:11.384 "num_blocks": 7936, 00:27:11.384 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:11.384 "md_size": 32, 00:27:11.384 "md_interleave": false, 00:27:11.384 "dif_type": 0, 00:27:11.384 "assigned_rate_limits": { 00:27:11.384 "rw_ios_per_sec": 0, 00:27:11.384 "rw_mbytes_per_sec": 0, 00:27:11.384 "r_mbytes_per_sec": 0, 00:27:11.384 "w_mbytes_per_sec": 0 00:27:11.384 }, 00:27:11.384 "claimed": false, 00:27:11.384 "zoned": false, 00:27:11.384 "supported_io_types": { 00:27:11.384 "read": true, 00:27:11.384 "write": true, 00:27:11.384 "unmap": false, 00:27:11.384 "flush": false, 00:27:11.384 "reset": true, 00:27:11.384 "nvme_admin": false, 00:27:11.384 "nvme_io": false, 00:27:11.384 "nvme_io_md": false, 00:27:11.384 "write_zeroes": true, 00:27:11.384 "zcopy": false, 00:27:11.384 "get_zone_info": false, 00:27:11.384 "zone_management": false, 00:27:11.384 "zone_append": false, 00:27:11.384 "compare": false, 00:27:11.384 "compare_and_write": false, 00:27:11.384 "abort": false, 00:27:11.384 "seek_hole": false, 00:27:11.384 "seek_data": false, 00:27:11.384 "copy": false, 00:27:11.384 "nvme_iov_md": false 00:27:11.384 }, 00:27:11.384 "memory_domains": [ 00:27:11.384 { 00:27:11.384 "dma_device_id": "system", 00:27:11.384 "dma_device_type": 1 00:27:11.384 }, 00:27:11.384 { 00:27:11.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.384 "dma_device_type": 2 00:27:11.384 }, 00:27:11.384 { 00:27:11.384 "dma_device_id": "system", 00:27:11.385 "dma_device_type": 1 00:27:11.385 }, 00:27:11.385 { 00:27:11.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.385 "dma_device_type": 2 00:27:11.385 } 00:27:11.385 ], 00:27:11.385 "driver_specific": { 00:27:11.385 "raid": { 
00:27:11.385 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:11.385 "strip_size_kb": 0, 00:27:11.385 "state": "online", 00:27:11.385 "raid_level": "raid1", 00:27:11.385 "superblock": true, 00:27:11.385 "num_base_bdevs": 2, 00:27:11.385 "num_base_bdevs_discovered": 2, 00:27:11.385 "num_base_bdevs_operational": 2, 00:27:11.385 "base_bdevs_list": [ 00:27:11.385 { 00:27:11.385 "name": "pt1", 00:27:11.385 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:11.385 "is_configured": true, 00:27:11.385 "data_offset": 256, 00:27:11.385 "data_size": 7936 00:27:11.385 }, 00:27:11.385 { 00:27:11.385 "name": "pt2", 00:27:11.385 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:11.385 "is_configured": true, 00:27:11.385 "data_offset": 256, 00:27:11.385 "data_size": 7936 00:27:11.385 } 00:27:11.385 ] 00:27:11.385 } 00:27:11.385 } 00:27:11.385 }' 00:27:11.385 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:11.641 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:11.641 pt2' 00:27:11.641 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:11.641 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:11.641 13:45:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.897 "name": "pt1", 00:27:11.897 "aliases": [ 00:27:11.897 "00000000-0000-0000-0000-000000000001" 00:27:11.897 ], 00:27:11.897 "product_name": "passthru", 00:27:11.897 "block_size": 4096, 00:27:11.897 "num_blocks": 8192, 00:27:11.897 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:27:11.897 "md_size": 32, 00:27:11.897 "md_interleave": false, 00:27:11.897 "dif_type": 0, 00:27:11.897 "assigned_rate_limits": { 00:27:11.897 "rw_ios_per_sec": 0, 00:27:11.897 "rw_mbytes_per_sec": 0, 00:27:11.897 "r_mbytes_per_sec": 0, 00:27:11.897 "w_mbytes_per_sec": 0 00:27:11.897 }, 00:27:11.897 "claimed": true, 00:27:11.897 "claim_type": "exclusive_write", 00:27:11.897 "zoned": false, 00:27:11.897 "supported_io_types": { 00:27:11.897 "read": true, 00:27:11.897 "write": true, 00:27:11.897 "unmap": true, 00:27:11.897 "flush": true, 00:27:11.897 "reset": true, 00:27:11.897 "nvme_admin": false, 00:27:11.897 "nvme_io": false, 00:27:11.897 "nvme_io_md": false, 00:27:11.897 "write_zeroes": true, 00:27:11.897 "zcopy": true, 00:27:11.897 "get_zone_info": false, 00:27:11.897 "zone_management": false, 00:27:11.897 "zone_append": false, 00:27:11.897 "compare": false, 00:27:11.897 "compare_and_write": false, 00:27:11.897 "abort": true, 00:27:11.897 "seek_hole": false, 00:27:11.897 "seek_data": false, 00:27:11.897 "copy": true, 00:27:11.897 "nvme_iov_md": false 00:27:11.897 }, 00:27:11.897 "memory_domains": [ 00:27:11.897 { 00:27:11.897 "dma_device_id": "system", 00:27:11.897 "dma_device_type": 1 00:27:11.897 }, 00:27:11.897 { 00:27:11.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.897 "dma_device_type": 2 00:27:11.897 } 00:27:11.897 ], 00:27:11.897 "driver_specific": { 00:27:11.897 "passthru": { 00:27:11.897 "name": "pt1", 00:27:11.897 "base_bdev_name": "malloc1" 00:27:11.897 } 00:27:11.897 } 00:27:11.897 }' 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:11.897 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:12.153 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:12.409 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:12.409 "name": "pt2", 00:27:12.409 "aliases": [ 00:27:12.409 "00000000-0000-0000-0000-000000000002" 00:27:12.409 ], 00:27:12.409 "product_name": "passthru", 00:27:12.409 "block_size": 4096, 00:27:12.409 "num_blocks": 8192, 00:27:12.409 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:12.409 "md_size": 32, 00:27:12.409 "md_interleave": false, 00:27:12.409 "dif_type": 0, 00:27:12.409 "assigned_rate_limits": { 00:27:12.409 "rw_ios_per_sec": 0, 00:27:12.409 "rw_mbytes_per_sec": 0, 00:27:12.409 "r_mbytes_per_sec": 0, 00:27:12.409 
"w_mbytes_per_sec": 0 00:27:12.409 }, 00:27:12.409 "claimed": true, 00:27:12.409 "claim_type": "exclusive_write", 00:27:12.409 "zoned": false, 00:27:12.409 "supported_io_types": { 00:27:12.409 "read": true, 00:27:12.409 "write": true, 00:27:12.409 "unmap": true, 00:27:12.409 "flush": true, 00:27:12.409 "reset": true, 00:27:12.409 "nvme_admin": false, 00:27:12.409 "nvme_io": false, 00:27:12.409 "nvme_io_md": false, 00:27:12.409 "write_zeroes": true, 00:27:12.409 "zcopy": true, 00:27:12.409 "get_zone_info": false, 00:27:12.409 "zone_management": false, 00:27:12.409 "zone_append": false, 00:27:12.409 "compare": false, 00:27:12.409 "compare_and_write": false, 00:27:12.409 "abort": true, 00:27:12.409 "seek_hole": false, 00:27:12.409 "seek_data": false, 00:27:12.409 "copy": true, 00:27:12.409 "nvme_iov_md": false 00:27:12.409 }, 00:27:12.409 "memory_domains": [ 00:27:12.409 { 00:27:12.409 "dma_device_id": "system", 00:27:12.409 "dma_device_type": 1 00:27:12.409 }, 00:27:12.409 { 00:27:12.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:12.409 "dma_device_type": 2 00:27:12.409 } 00:27:12.409 ], 00:27:12.409 "driver_specific": { 00:27:12.409 "passthru": { 00:27:12.409 "name": "pt2", 00:27:12.409 "base_bdev_name": "malloc2" 00:27:12.409 } 00:27:12.409 } 00:27:12.409 }' 00:27:12.409 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.409 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.409 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:12.409 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.666 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.666 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:12.666 13:45:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.666 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.666 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:12.666 13:45:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.666 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.666 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:12.666 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:12.666 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:12.922 [2024-07-15 13:45:52.188641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:12.922 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 30cd1936-54e7-4962-a517-61cde7f78077 '!=' 30cd1936-54e7-4962-a517-61cde7f78077 ']' 00:27:12.922 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:12.922 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:12.922 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:12.922 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:13.179 [2024-07-15 13:45:52.429042] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.179 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.436 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.436 "name": "raid_bdev1", 00:27:13.436 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:13.436 "strip_size_kb": 0, 00:27:13.436 "state": "online", 00:27:13.436 "raid_level": "raid1", 00:27:13.436 "superblock": true, 00:27:13.436 "num_base_bdevs": 2, 00:27:13.436 "num_base_bdevs_discovered": 1, 00:27:13.436 "num_base_bdevs_operational": 1, 00:27:13.436 
"base_bdevs_list": [ 00:27:13.436 { 00:27:13.436 "name": null, 00:27:13.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.436 "is_configured": false, 00:27:13.436 "data_offset": 256, 00:27:13.436 "data_size": 7936 00:27:13.436 }, 00:27:13.436 { 00:27:13.436 "name": "pt2", 00:27:13.436 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.436 "is_configured": true, 00:27:13.436 "data_offset": 256, 00:27:13.436 "data_size": 7936 00:27:13.436 } 00:27:13.436 ] 00:27:13.436 }' 00:27:13.436 13:45:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.436 13:45:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:13.998 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:14.254 [2024-07-15 13:45:53.431679] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:14.255 [2024-07-15 13:45:53.431709] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:14.255 [2024-07-15 13:45:53.431759] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:14.255 [2024-07-15 13:45:53.431803] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:14.255 [2024-07-15 13:45:53.431815] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e05d0 name raid_bdev1, state offline 00:27:14.255 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:14.255 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.511 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:27:14.511 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:14.511 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:14.511 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:14.511 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:27:14.769 13:45:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:14.769 [2024-07-15 13:45:54.189664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:14.769 [2024-07-15 13:45:54.189711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:14.769 [2024-07-15 13:45:54.189730] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24de660 00:27:14.769 [2024-07-15 13:45:54.189743] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:14.769 [2024-07-15 13:45:54.191467] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:14.769 
[2024-07-15 13:45:54.191498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:14.769 [2024-07-15 13:45:54.191563] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:14.769 [2024-07-15 13:45:54.191590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:14.769 [2024-07-15 13:45:54.191674] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e0d10 00:27:14.769 [2024-07-15 13:45:54.191684] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:14.769 [2024-07-15 13:45:54.191745] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e1560 00:27:14.769 [2024-07-15 13:45:54.191848] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e0d10 00:27:14.769 [2024-07-15 13:45:54.191858] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24e0d10 00:27:14.769 [2024-07-15 13:45:54.191935] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.027 pt2 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.027 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.284 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.284 "name": "raid_bdev1", 00:27:15.284 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:15.284 "strip_size_kb": 0, 00:27:15.284 "state": "online", 00:27:15.284 "raid_level": "raid1", 00:27:15.284 "superblock": true, 00:27:15.284 "num_base_bdevs": 2, 00:27:15.284 "num_base_bdevs_discovered": 1, 00:27:15.284 "num_base_bdevs_operational": 1, 00:27:15.284 "base_bdevs_list": [ 00:27:15.284 { 00:27:15.284 "name": null, 00:27:15.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.284 "is_configured": false, 00:27:15.284 "data_offset": 256, 00:27:15.284 "data_size": 7936 00:27:15.284 }, 00:27:15.284 { 00:27:15.284 "name": "pt2", 00:27:15.284 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:15.284 "is_configured": true, 00:27:15.284 "data_offset": 256, 00:27:15.284 "data_size": 7936 00:27:15.284 } 00:27:15.284 ] 00:27:15.284 }' 00:27:15.284 13:45:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.284 13:45:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:15.849 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:16.106 [2024-07-15 13:45:55.296594] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:16.106 [2024-07-15 13:45:55.296624] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:16.106 [2024-07-15 13:45:55.296678] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:16.106 [2024-07-15 13:45:55.296725] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:16.106 [2024-07-15 13:45:55.296736] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e0d10 name raid_bdev1, state offline 00:27:16.106 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:16.106 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.365 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:16.365 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:16.365 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:16.365 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:16.365 [2024-07-15 13:45:55.781855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:16.365 [2024-07-15 13:45:55.781903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.365 [2024-07-15 13:45:55.781921] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24df760 00:27:16.365 [2024-07-15 13:45:55.781939] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.365 [2024-07-15 13:45:55.783694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.365 [2024-07-15 13:45:55.783723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:16.365 [2024-07-15 13:45:55.783777] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:16.365 [2024-07-15 13:45:55.783804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:16.365 [2024-07-15 13:45:55.783905] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:16.365 [2024-07-15 13:45:55.783919] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:16.365 [2024-07-15 13:45:55.783941] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e1850 name raid_bdev1, state configuring 00:27:16.365 [2024-07-15 13:45:55.783969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:16.365 [2024-07-15 13:45:55.784027] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e0850 00:27:16.365 [2024-07-15 13:45:55.784037] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:16.365 [2024-07-15 13:45:55.784106] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e13b0 00:27:16.365 [2024-07-15 13:45:55.784214] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e0850 00:27:16.365 [2024-07-15 13:45:55.784224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24e0850 00:27:16.365 [2024-07-15 13:45:55.784301] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:27:16.365 pt1 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.624 13:45:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.624 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.624 "name": "raid_bdev1", 00:27:16.624 "uuid": "30cd1936-54e7-4962-a517-61cde7f78077", 00:27:16.624 "strip_size_kb": 0, 00:27:16.624 "state": "online", 00:27:16.624 "raid_level": 
"raid1", 00:27:16.624 "superblock": true, 00:27:16.624 "num_base_bdevs": 2, 00:27:16.624 "num_base_bdevs_discovered": 1, 00:27:16.624 "num_base_bdevs_operational": 1, 00:27:16.624 "base_bdevs_list": [ 00:27:16.624 { 00:27:16.624 "name": null, 00:27:16.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.624 "is_configured": false, 00:27:16.624 "data_offset": 256, 00:27:16.624 "data_size": 7936 00:27:16.624 }, 00:27:16.624 { 00:27:16.624 "name": "pt2", 00:27:16.624 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:16.624 "is_configured": true, 00:27:16.624 "data_offset": 256, 00:27:16.624 "data_size": 7936 00:27:16.624 } 00:27:16.624 ] 00:27:16.624 }' 00:27:16.624 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.624 13:45:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:17.556 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:17.556 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:17.556 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:17.556 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:17.556 13:45:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:17.814 [2024-07-15 13:45:57.125683] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 30cd1936-54e7-4962-a517-61cde7f78077 '!=' 30cd1936-54e7-4962-a517-61cde7f78077 ']' 
00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2216574 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2216574 ']' 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2216574 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2216574 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2216574' 00:27:17.814 killing process with pid 2216574 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2216574 00:27:17.814 [2024-07-15 13:45:57.185909] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:17.814 [2024-07-15 13:45:57.185976] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:17.814 [2024-07-15 13:45:57.186023] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:17.814 [2024-07-15 13:45:57.186034] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e0850 name raid_bdev1, state offline 00:27:17.814 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2216574 00:27:17.814 [2024-07-15 13:45:57.228421] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:18.379 13:45:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:27:18.379 00:27:18.379 real 0m16.491s 00:27:18.379 user 0m29.845s 00:27:18.379 sys 0m2.924s 00:27:18.379 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:18.379 13:45:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:18.379 ************************************ 00:27:18.379 END TEST raid_superblock_test_md_separate 00:27:18.379 ************************************ 00:27:18.379 13:45:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:18.379 13:45:57 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:27:18.379 13:45:57 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:27:18.379 13:45:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:18.379 13:45:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:18.379 13:45:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:18.379 ************************************ 00:27:18.379 START TEST raid_rebuild_test_sb_md_separate 00:27:18.379 ************************************ 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:18.379 13:45:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2219003 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2219003 /var/tmp/spdk-raid.sock 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2219003 ']' 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:18.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:18.379 13:45:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:18.379 [2024-07-15 13:45:57.721222] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:27:18.379 [2024-07-15 13:45:57.721293] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2219003 ] 00:27:18.379 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:18.379 Zero copy mechanism will not be used. 00:27:18.636 [2024-07-15 13:45:57.851545] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:18.636 [2024-07-15 13:45:57.951522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:18.636 [2024-07-15 13:45:58.017019] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:18.636 [2024-07-15 13:45:58.017068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:19.196 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:19.196 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:19.196 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:19.196 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:19.454 BaseBdev1_malloc 00:27:19.454 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:19.710 [2024-07-15 13:45:58.927802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:19.710 [2024-07-15 13:45:58.927853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.710 [2024-07-15 
13:45:58.927876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19e16d0 00:27:19.710 [2024-07-15 13:45:58.927889] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.710 [2024-07-15 13:45:58.929321] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.710 [2024-07-15 13:45:58.929350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:19.710 BaseBdev1 00:27:19.710 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:19.710 13:45:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:19.710 BaseBdev2_malloc 00:27:19.711 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:19.967 [2024-07-15 13:45:59.290249] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:19.967 [2024-07-15 13:45:59.290300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.967 [2024-07-15 13:45:59.290323] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b391f0 00:27:19.967 [2024-07-15 13:45:59.290340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.967 [2024-07-15 13:45:59.291649] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.967 [2024-07-15 13:45:59.291676] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:19.967 BaseBdev2 00:27:19.967 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:20.225 spare_malloc 00:27:20.225 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:20.482 spare_delay 00:27:20.482 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:20.482 [2024-07-15 13:45:59.877395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:20.482 [2024-07-15 13:45:59.877440] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.482 [2024-07-15 13:45:59.877462] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b357a0 00:27:20.482 [2024-07-15 13:45:59.877475] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.482 [2024-07-15 13:45:59.878724] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.482 [2024-07-15 13:45:59.878750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:20.482 spare 00:27:20.482 13:45:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:20.740 [2024-07-15 13:46:00.041962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:20.740 [2024-07-15 13:46:00.043494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:20.740 [2024-07-15 13:46:00.043661] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b361c0 00:27:20.740 [2024-07-15 13:46:00.043674] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:20.740 [2024-07-15 13:46:00.043746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a47360 00:27:20.740 [2024-07-15 13:46:00.043857] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b361c0 00:27:20.740 [2024-07-15 13:46:00.043867] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b361c0 00:27:20.740 [2024-07-15 13:46:00.043946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.740 13:46:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.740 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.997 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.997 "name": "raid_bdev1", 00:27:20.997 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:20.997 "strip_size_kb": 0, 00:27:20.997 "state": "online", 00:27:20.997 "raid_level": "raid1", 00:27:20.997 "superblock": true, 00:27:20.997 "num_base_bdevs": 2, 00:27:20.997 "num_base_bdevs_discovered": 2, 00:27:20.997 "num_base_bdevs_operational": 2, 00:27:20.997 "base_bdevs_list": [ 00:27:20.997 { 00:27:20.997 "name": "BaseBdev1", 00:27:20.997 "uuid": "46a4f54b-6955-5ef0-be7c-e0cf1bd944bc", 00:27:20.997 "is_configured": true, 00:27:20.997 "data_offset": 256, 00:27:20.997 "data_size": 7936 00:27:20.997 }, 00:27:20.997 { 00:27:20.997 "name": "BaseBdev2", 00:27:20.997 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:20.997 "is_configured": true, 00:27:20.997 "data_offset": 256, 00:27:20.997 "data_size": 7936 00:27:20.997 } 00:27:20.997 ] 00:27:20.997 }' 00:27:20.997 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.997 13:46:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:21.930 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:21.930 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.187 [2024-07-15 13:46:01.413758] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.187 
13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:22.187 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.187 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.444 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.444 13:46:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:22.701 [2024-07-15 13:46:01.890824] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a47360 00:27:22.701 /dev/nbd0 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:22.701 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.702 1+0 records in 00:27:22.702 1+0 records out 00:27:22.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231816 s, 17.7 MB/s 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:22.702 13:46:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:23.633 7936+0 records in 00:27:23.633 7936+0 records out 00:27:23.633 32505856 bytes (33 MB, 31 MiB) copied, 0.750221 s, 43.3 MB/s 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:23.634 13:46:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.634 [2024-07-15 13:46:02.909305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.634 13:46:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:23.892 [2024-07-15 13:46:03.069772] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.892 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.893 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.893 "name": "raid_bdev1", 00:27:23.893 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:23.893 "strip_size_kb": 0, 00:27:23.893 "state": "online", 00:27:23.893 "raid_level": "raid1", 00:27:23.893 "superblock": true, 00:27:23.893 "num_base_bdevs": 2, 00:27:23.893 "num_base_bdevs_discovered": 1, 00:27:23.893 "num_base_bdevs_operational": 1, 00:27:23.893 "base_bdevs_list": [ 00:27:23.893 { 00:27:23.893 "name": null, 00:27:23.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.893 "is_configured": false, 00:27:23.893 "data_offset": 256, 00:27:23.893 "data_size": 7936 00:27:23.893 }, 00:27:23.893 { 00:27:23.893 "name": "BaseBdev2", 
00:27:23.893 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:23.893 "is_configured": true, 00:27:23.893 "data_offset": 256, 00:27:23.893 "data_size": 7936 00:27:23.893 } 00:27:23.893 ] 00:27:23.893 }' 00:27:23.893 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.893 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:24.457 13:46:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:24.712 [2024-07-15 13:46:04.080478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.712 [2024-07-15 13:46:04.082946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19e0350 00:27:24.712 [2024-07-15 13:46:04.085315] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:24.712 13:46:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.078 13:46:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:26.078 "name": "raid_bdev1", 00:27:26.078 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:26.078 "strip_size_kb": 0, 00:27:26.078 "state": "online", 00:27:26.078 "raid_level": "raid1", 00:27:26.078 "superblock": true, 00:27:26.078 "num_base_bdevs": 2, 00:27:26.078 "num_base_bdevs_discovered": 2, 00:27:26.078 "num_base_bdevs_operational": 2, 00:27:26.078 "process": { 00:27:26.078 "type": "rebuild", 00:27:26.078 "target": "spare", 00:27:26.078 "progress": { 00:27:26.078 "blocks": 3072, 00:27:26.078 "percent": 38 00:27:26.078 } 00:27:26.078 }, 00:27:26.078 "base_bdevs_list": [ 00:27:26.078 { 00:27:26.078 "name": "spare", 00:27:26.078 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:26.078 "is_configured": true, 00:27:26.078 "data_offset": 256, 00:27:26.078 "data_size": 7936 00:27:26.078 }, 00:27:26.078 { 00:27:26.078 "name": "BaseBdev2", 00:27:26.078 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:26.078 "is_configured": true, 00:27:26.078 "data_offset": 256, 00:27:26.078 "data_size": 7936 00:27:26.078 } 00:27:26.078 ] 00:27:26.078 }' 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:26.078 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:27:26.334 [2024-07-15 13:46:05.686347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.334 [2024-07-15 13:46:05.698179] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:26.334 [2024-07-15 13:46:05.698226] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:26.334 [2024-07-15 13:46:05.698242] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.334 [2024-07-15 13:46:05.698251] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:26.334 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:26.334 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.335 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.591 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.591 "name": "raid_bdev1", 00:27:26.591 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:26.591 "strip_size_kb": 0, 00:27:26.591 "state": "online", 00:27:26.591 "raid_level": "raid1", 00:27:26.591 "superblock": true, 00:27:26.591 "num_base_bdevs": 2, 00:27:26.591 "num_base_bdevs_discovered": 1, 00:27:26.591 "num_base_bdevs_operational": 1, 00:27:26.591 "base_bdevs_list": [ 00:27:26.591 { 00:27:26.591 "name": null, 00:27:26.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.591 "is_configured": false, 00:27:26.591 "data_offset": 256, 00:27:26.591 "data_size": 7936 00:27:26.591 }, 00:27:26.591 { 00:27:26.591 "name": "BaseBdev2", 00:27:26.591 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:26.591 "is_configured": true, 00:27:26.591 "data_offset": 256, 00:27:26.591 "data_size": 7936 00:27:26.591 } 00:27:26.591 ] 00:27:26.591 }' 00:27:26.591 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.591 13:46:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:27.154 13:46:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.154 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.411 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.411 "name": "raid_bdev1", 00:27:27.411 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:27.411 "strip_size_kb": 0, 00:27:27.411 "state": "online", 00:27:27.411 "raid_level": "raid1", 00:27:27.411 "superblock": true, 00:27:27.411 "num_base_bdevs": 2, 00:27:27.411 "num_base_bdevs_discovered": 1, 00:27:27.411 "num_base_bdevs_operational": 1, 00:27:27.411 "base_bdevs_list": [ 00:27:27.411 { 00:27:27.411 "name": null, 00:27:27.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.411 "is_configured": false, 00:27:27.411 "data_offset": 256, 00:27:27.411 "data_size": 7936 00:27:27.411 }, 00:27:27.411 { 00:27:27.411 "name": "BaseBdev2", 00:27:27.411 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:27.411 "is_configured": true, 00:27:27.411 "data_offset": 256, 00:27:27.411 "data_size": 7936 00:27:27.411 } 00:27:27.411 ] 00:27:27.411 }' 00:27:27.411 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.668 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:27.668 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.668 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:27.668 13:46:06 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:27.924 [2024-07-15 13:46:07.133201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:27.924 [2024-07-15 13:46:07.135829] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19e1280 00:27:27.924 [2024-07-15 13:46:07.137474] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:27.924 13:46:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.855 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.111 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.111 "name": "raid_bdev1", 00:27:29.112 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:29.112 "strip_size_kb": 0, 00:27:29.112 "state": "online", 00:27:29.112 "raid_level": "raid1", 00:27:29.112 "superblock": true, 00:27:29.112 "num_base_bdevs": 2, 
00:27:29.112 "num_base_bdevs_discovered": 2, 00:27:29.112 "num_base_bdevs_operational": 2, 00:27:29.112 "process": { 00:27:29.112 "type": "rebuild", 00:27:29.112 "target": "spare", 00:27:29.112 "progress": { 00:27:29.112 "blocks": 3072, 00:27:29.112 "percent": 38 00:27:29.112 } 00:27:29.112 }, 00:27:29.112 "base_bdevs_list": [ 00:27:29.112 { 00:27:29.112 "name": "spare", 00:27:29.112 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:29.112 "is_configured": true, 00:27:29.112 "data_offset": 256, 00:27:29.112 "data_size": 7936 00:27:29.112 }, 00:27:29.112 { 00:27:29.112 "name": "BaseBdev2", 00:27:29.112 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:29.112 "is_configured": true, 00:27:29.112 "data_offset": 256, 00:27:29.112 "data_size": 7936 00:27:29.112 } 00:27:29.112 ] 00:27:29.112 }' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:29.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1072 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.112 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.369 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.369 "name": "raid_bdev1", 00:27:29.369 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:29.369 "strip_size_kb": 0, 00:27:29.369 "state": "online", 00:27:29.369 "raid_level": "raid1", 00:27:29.369 "superblock": true, 00:27:29.369 "num_base_bdevs": 2, 00:27:29.369 "num_base_bdevs_discovered": 2, 00:27:29.369 "num_base_bdevs_operational": 2, 00:27:29.369 "process": { 00:27:29.369 "type": "rebuild", 00:27:29.369 "target": "spare", 00:27:29.369 "progress": { 00:27:29.369 "blocks": 3840, 00:27:29.369 "percent": 48 00:27:29.369 } 00:27:29.369 }, 00:27:29.369 "base_bdevs_list": [ 00:27:29.369 { 00:27:29.369 "name": "spare", 00:27:29.369 "uuid": 
"70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:29.369 "is_configured": true, 00:27:29.369 "data_offset": 256, 00:27:29.369 "data_size": 7936 00:27:29.369 }, 00:27:29.369 { 00:27:29.369 "name": "BaseBdev2", 00:27:29.369 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:29.369 "is_configured": true, 00:27:29.369 "data_offset": 256, 00:27:29.369 "data_size": 7936 00:27:29.369 } 00:27:29.369 ] 00:27:29.369 }' 00:27:29.369 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.369 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.369 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.626 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.626 13:46:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.553 13:46:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:30.808 "name": "raid_bdev1", 00:27:30.808 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:30.808 "strip_size_kb": 0, 00:27:30.808 "state": "online", 00:27:30.808 "raid_level": "raid1", 00:27:30.808 "superblock": true, 00:27:30.808 "num_base_bdevs": 2, 00:27:30.808 "num_base_bdevs_discovered": 2, 00:27:30.808 "num_base_bdevs_operational": 2, 00:27:30.808 "process": { 00:27:30.808 "type": "rebuild", 00:27:30.808 "target": "spare", 00:27:30.808 "progress": { 00:27:30.808 "blocks": 7424, 00:27:30.808 "percent": 93 00:27:30.808 } 00:27:30.808 }, 00:27:30.808 "base_bdevs_list": [ 00:27:30.808 { 00:27:30.808 "name": "spare", 00:27:30.808 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:30.808 "is_configured": true, 00:27:30.808 "data_offset": 256, 00:27:30.808 "data_size": 7936 00:27:30.808 }, 00:27:30.808 { 00:27:30.808 "name": "BaseBdev2", 00:27:30.808 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:30.808 "is_configured": true, 00:27:30.808 "data_offset": 256, 00:27:30.808 "data_size": 7936 00:27:30.808 } 00:27:30.808 ] 00:27:30.808 }' 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:30.808 13:46:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:31.066 [2024-07-15 13:46:10.261766] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:27:31.066 [2024-07-15 13:46:10.261827] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:31.066 [2024-07-15 13:46:10.261909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.995 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:32.252 "name": "raid_bdev1", 00:27:32.252 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:32.252 "strip_size_kb": 0, 00:27:32.252 "state": "online", 00:27:32.252 "raid_level": "raid1", 00:27:32.252 "superblock": true, 00:27:32.252 "num_base_bdevs": 2, 00:27:32.252 "num_base_bdevs_discovered": 2, 00:27:32.252 "num_base_bdevs_operational": 2, 00:27:32.252 "base_bdevs_list": [ 00:27:32.252 { 00:27:32.252 "name": "spare", 00:27:32.252 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 
00:27:32.252 "is_configured": true, 00:27:32.252 "data_offset": 256, 00:27:32.252 "data_size": 7936 00:27:32.252 }, 00:27:32.252 { 00:27:32.252 "name": "BaseBdev2", 00:27:32.252 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:32.252 "is_configured": true, 00:27:32.252 "data_offset": 256, 00:27:32.252 "data_size": 7936 00:27:32.252 } 00:27:32.252 ] 00:27:32.252 }' 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.252 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.509 13:46:11 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:32.509 "name": "raid_bdev1", 00:27:32.509 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:32.509 "strip_size_kb": 0, 00:27:32.509 "state": "online", 00:27:32.509 "raid_level": "raid1", 00:27:32.509 "superblock": true, 00:27:32.509 "num_base_bdevs": 2, 00:27:32.509 "num_base_bdevs_discovered": 2, 00:27:32.509 "num_base_bdevs_operational": 2, 00:27:32.509 "base_bdevs_list": [ 00:27:32.509 { 00:27:32.509 "name": "spare", 00:27:32.509 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:32.509 "is_configured": true, 00:27:32.509 "data_offset": 256, 00:27:32.509 "data_size": 7936 00:27:32.509 }, 00:27:32.509 { 00:27:32.509 "name": "BaseBdev2", 00:27:32.509 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:32.509 "is_configured": true, 00:27:32.509 "data_offset": 256, 00:27:32.509 "data_size": 7936 00:27:32.509 } 00:27:32.509 ] 00:27:32.509 }' 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.509 13:46:11 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.509 13:46:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.766 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.766 "name": "raid_bdev1", 00:27:32.766 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:32.766 "strip_size_kb": 0, 00:27:32.766 "state": "online", 00:27:32.766 "raid_level": "raid1", 00:27:32.766 "superblock": true, 00:27:32.766 "num_base_bdevs": 2, 00:27:32.766 "num_base_bdevs_discovered": 2, 00:27:32.766 "num_base_bdevs_operational": 2, 00:27:32.766 "base_bdevs_list": [ 00:27:32.766 { 00:27:32.766 "name": "spare", 00:27:32.766 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:32.766 "is_configured": true, 00:27:32.766 "data_offset": 256, 00:27:32.766 "data_size": 7936 00:27:32.766 }, 00:27:32.766 { 00:27:32.766 "name": "BaseBdev2", 00:27:32.766 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:32.766 "is_configured": true, 00:27:32.766 "data_offset": 256, 00:27:32.766 "data_size": 7936 00:27:32.766 } 00:27:32.766 ] 
00:27:32.766 }' 00:27:32.766 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.766 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:33.328 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:33.585 [2024-07-15 13:46:12.896163] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:33.585 [2024-07-15 13:46:12.896196] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:33.585 [2024-07-15 13:46:12.896263] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:33.585 [2024-07-15 13:46:12.896328] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:33.585 [2024-07-15 13:46:12.896341] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b361c0 name raid_bdev1, state offline 00:27:33.585 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.585 13:46:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:33.841 13:46:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:33.841 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:33.842 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.842 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:34.099 /dev/nbd0 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:34.099 13:46:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:34.099 1+0 records in 00:27:34.099 1+0 records out 00:27:34.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223445 s, 18.3 MB/s 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:34.099 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:34.357 /dev/nbd1 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:34.357 13:46:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:34.357 1+0 records in 00:27:34.357 1+0 records out 00:27:34.357 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305912 s, 13.4 MB/s 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:34.357 13:46:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:34.357 13:46:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:34.621 13:46:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:34.621 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:34.878 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:35.136 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:35.394 [2024-07-15 13:46:14.697181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:35.394 [2024-07-15 13:46:14.697228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:35.394 [2024-07-15 13:46:14.697251] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b35f60 00:27:35.394 [2024-07-15 13:46:14.697268] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:35.394 [2024-07-15 13:46:14.698757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:35.394 [2024-07-15 13:46:14.698787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:35.394 [2024-07-15 13:46:14.698848] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:35.394 [2024-07-15 13:46:14.698875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:35.394 [2024-07-15 13:46:14.698982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:35.394 spare 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.394 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.394 [2024-07-15 13:46:14.799293] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a477c0 00:27:35.394 [2024-07-15 13:46:14.799310] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:35.394 [2024-07-15 13:46:14.799381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a47480 00:27:35.394 [2024-07-15 13:46:14.799503] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a477c0 00:27:35.394 [2024-07-15 13:46:14.799513] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a477c0 00:27:35.394 [2024-07-15 13:46:14.799590] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:35.652 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:35.652 "name": "raid_bdev1", 00:27:35.652 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:35.652 "strip_size_kb": 0, 00:27:35.652 "state": "online", 00:27:35.652 "raid_level": "raid1", 00:27:35.652 "superblock": true, 00:27:35.652 "num_base_bdevs": 2, 00:27:35.652 
"num_base_bdevs_discovered": 2, 00:27:35.652 "num_base_bdevs_operational": 2, 00:27:35.652 "base_bdevs_list": [ 00:27:35.652 { 00:27:35.652 "name": "spare", 00:27:35.652 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:35.652 "is_configured": true, 00:27:35.652 "data_offset": 256, 00:27:35.652 "data_size": 7936 00:27:35.652 }, 00:27:35.652 { 00:27:35.652 "name": "BaseBdev2", 00:27:35.652 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:35.652 "is_configured": true, 00:27:35.652 "data_offset": 256, 00:27:35.652 "data_size": 7936 00:27:35.652 } 00:27:35.652 ] 00:27:35.652 }' 00:27:35.652 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:35.652 13:46:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.217 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.476 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:36.476 "name": "raid_bdev1", 00:27:36.477 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:36.477 
"strip_size_kb": 0, 00:27:36.477 "state": "online", 00:27:36.477 "raid_level": "raid1", 00:27:36.477 "superblock": true, 00:27:36.477 "num_base_bdevs": 2, 00:27:36.477 "num_base_bdevs_discovered": 2, 00:27:36.477 "num_base_bdevs_operational": 2, 00:27:36.477 "base_bdevs_list": [ 00:27:36.477 { 00:27:36.477 "name": "spare", 00:27:36.477 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:36.477 "is_configured": true, 00:27:36.477 "data_offset": 256, 00:27:36.477 "data_size": 7936 00:27:36.477 }, 00:27:36.477 { 00:27:36.477 "name": "BaseBdev2", 00:27:36.477 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:36.477 "is_configured": true, 00:27:36.477 "data_offset": 256, 00:27:36.477 "data_size": 7936 00:27:36.477 } 00:27:36.477 ] 00:27:36.477 }' 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.477 13:46:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:36.736 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:36.736 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:36.993 [2024-07-15 13:46:16.237390] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.993 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.250 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.250 "name": "raid_bdev1", 00:27:37.250 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:37.250 "strip_size_kb": 0, 00:27:37.250 "state": "online", 00:27:37.250 "raid_level": "raid1", 00:27:37.250 "superblock": true, 00:27:37.250 
"num_base_bdevs": 2, 00:27:37.250 "num_base_bdevs_discovered": 1, 00:27:37.250 "num_base_bdevs_operational": 1, 00:27:37.250 "base_bdevs_list": [ 00:27:37.250 { 00:27:37.250 "name": null, 00:27:37.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.250 "is_configured": false, 00:27:37.250 "data_offset": 256, 00:27:37.250 "data_size": 7936 00:27:37.250 }, 00:27:37.250 { 00:27:37.250 "name": "BaseBdev2", 00:27:37.250 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:37.250 "is_configured": true, 00:27:37.250 "data_offset": 256, 00:27:37.250 "data_size": 7936 00:27:37.250 } 00:27:37.250 ] 00:27:37.250 }' 00:27:37.250 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.250 13:46:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:37.814 13:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:37.814 [2024-07-15 13:46:17.224025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:37.814 [2024-07-15 13:46:17.224188] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:37.814 [2024-07-15 13:46:17.224205] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:37.814 [2024-07-15 13:46:17.224234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:37.814 [2024-07-15 13:46:17.226430] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19e1280 00:27:37.814 [2024-07-15 13:46:17.227796] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:38.071 13:46:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:39.030 "name": "raid_bdev1", 00:27:39.030 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:39.030 "strip_size_kb": 0, 00:27:39.030 "state": "online", 00:27:39.030 "raid_level": "raid1", 00:27:39.030 "superblock": true, 00:27:39.030 "num_base_bdevs": 2, 00:27:39.030 "num_base_bdevs_discovered": 2, 00:27:39.030 "num_base_bdevs_operational": 2, 00:27:39.030 "process": { 00:27:39.030 "type": "rebuild", 00:27:39.030 
"target": "spare", 00:27:39.030 "progress": { 00:27:39.030 "blocks": 2816, 00:27:39.030 "percent": 35 00:27:39.030 } 00:27:39.030 }, 00:27:39.030 "base_bdevs_list": [ 00:27:39.030 { 00:27:39.030 "name": "spare", 00:27:39.030 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:39.030 "is_configured": true, 00:27:39.030 "data_offset": 256, 00:27:39.030 "data_size": 7936 00:27:39.030 }, 00:27:39.030 { 00:27:39.030 "name": "BaseBdev2", 00:27:39.030 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:39.030 "is_configured": true, 00:27:39.030 "data_offset": 256, 00:27:39.030 "data_size": 7936 00:27:39.030 } 00:27:39.030 ] 00:27:39.030 }' 00:27:39.030 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:39.288 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:39.288 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:39.288 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:39.288 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:39.545 [2024-07-15 13:46:18.757501] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.545 [2024-07-15 13:46:18.840431] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:39.545 [2024-07-15 13:46:18.840478] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:39.545 [2024-07-15 13:46:18.840494] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.545 [2024-07-15 13:46:18.840503] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.545 13:46:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.802 13:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.802 "name": "raid_bdev1", 00:27:39.802 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:39.802 "strip_size_kb": 0, 00:27:39.802 "state": "online", 00:27:39.802 "raid_level": "raid1", 00:27:39.802 "superblock": true, 00:27:39.802 "num_base_bdevs": 2, 00:27:39.802 "num_base_bdevs_discovered": 1, 
00:27:39.802 "num_base_bdevs_operational": 1, 00:27:39.802 "base_bdevs_list": [ 00:27:39.802 { 00:27:39.802 "name": null, 00:27:39.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.802 "is_configured": false, 00:27:39.802 "data_offset": 256, 00:27:39.802 "data_size": 7936 00:27:39.802 }, 00:27:39.802 { 00:27:39.802 "name": "BaseBdev2", 00:27:39.802 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:39.802 "is_configured": true, 00:27:39.802 "data_offset": 256, 00:27:39.802 "data_size": 7936 00:27:39.802 } 00:27:39.802 ] 00:27:39.802 }' 00:27:39.802 13:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.802 13:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:40.366 13:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:40.623 [2024-07-15 13:46:19.878316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:40.623 [2024-07-15 13:46:19.878369] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:40.623 [2024-07-15 13:46:19.878393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a46ca0 00:27:40.623 [2024-07-15 13:46:19.878406] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:40.623 [2024-07-15 13:46:19.878635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:40.623 [2024-07-15 13:46:19.878651] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:40.623 [2024-07-15 13:46:19.878737] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:40.623 [2024-07-15 13:46:19.878749] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:27:40.623 [2024-07-15 13:46:19.878766] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:40.623 [2024-07-15 13:46:19.878784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:40.623 [2024-07-15 13:46:19.880982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b37570 00:27:40.623 [2024-07-15 13:46:19.882347] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:40.623 spare 00:27:40.623 13:46:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.555 13:46:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.813 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.813 "name": "raid_bdev1", 00:27:41.813 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:41.813 "strip_size_kb": 0, 00:27:41.813 "state": "online", 00:27:41.813 "raid_level": "raid1", 00:27:41.813 "superblock": 
true, 00:27:41.813 "num_base_bdevs": 2, 00:27:41.813 "num_base_bdevs_discovered": 2, 00:27:41.813 "num_base_bdevs_operational": 2, 00:27:41.813 "process": { 00:27:41.813 "type": "rebuild", 00:27:41.813 "target": "spare", 00:27:41.813 "progress": { 00:27:41.813 "blocks": 3072, 00:27:41.813 "percent": 38 00:27:41.813 } 00:27:41.813 }, 00:27:41.813 "base_bdevs_list": [ 00:27:41.813 { 00:27:41.813 "name": "spare", 00:27:41.813 "uuid": "70bd00f5-83f7-5604-bb20-6ddade8c5439", 00:27:41.813 "is_configured": true, 00:27:41.813 "data_offset": 256, 00:27:41.813 "data_size": 7936 00:27:41.813 }, 00:27:41.813 { 00:27:41.813 "name": "BaseBdev2", 00:27:41.813 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:41.813 "is_configured": true, 00:27:41.813 "data_offset": 256, 00:27:41.813 "data_size": 7936 00:27:41.813 } 00:27:41.813 ] 00:27:41.813 }' 00:27:41.813 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.813 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:41.813 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.070 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:42.070 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:42.070 [2024-07-15 13:46:21.411043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.070 [2024-07-15 13:46:21.495061] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:42.070 [2024-07-15 13:46:21.495110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.070 [2024-07-15 13:46:21.495127] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.070 [2024-07-15 13:46:21.495135] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.328 "name": "raid_bdev1", 00:27:42.328 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 
00:27:42.328 "strip_size_kb": 0, 00:27:42.328 "state": "online", 00:27:42.328 "raid_level": "raid1", 00:27:42.328 "superblock": true, 00:27:42.328 "num_base_bdevs": 2, 00:27:42.328 "num_base_bdevs_discovered": 1, 00:27:42.328 "num_base_bdevs_operational": 1, 00:27:42.328 "base_bdevs_list": [ 00:27:42.328 { 00:27:42.328 "name": null, 00:27:42.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.328 "is_configured": false, 00:27:42.328 "data_offset": 256, 00:27:42.328 "data_size": 7936 00:27:42.328 }, 00:27:42.328 { 00:27:42.328 "name": "BaseBdev2", 00:27:42.328 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:42.328 "is_configured": true, 00:27:42.328 "data_offset": 256, 00:27:42.328 "data_size": 7936 00:27:42.328 } 00:27:42.328 ] 00:27:42.328 }' 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.328 13:46:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.893 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.150 13:46:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.150 "name": "raid_bdev1", 00:27:43.150 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:43.150 "strip_size_kb": 0, 00:27:43.150 "state": "online", 00:27:43.150 "raid_level": "raid1", 00:27:43.150 "superblock": true, 00:27:43.150 "num_base_bdevs": 2, 00:27:43.150 "num_base_bdevs_discovered": 1, 00:27:43.150 "num_base_bdevs_operational": 1, 00:27:43.150 "base_bdevs_list": [ 00:27:43.150 { 00:27:43.150 "name": null, 00:27:43.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.150 "is_configured": false, 00:27:43.150 "data_offset": 256, 00:27:43.150 "data_size": 7936 00:27:43.150 }, 00:27:43.150 { 00:27:43.150 "name": "BaseBdev2", 00:27:43.150 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:43.150 "is_configured": true, 00:27:43.150 "data_offset": 256, 00:27:43.150 "data_size": 7936 00:27:43.150 } 00:27:43.150 ] 00:27:43.150 }' 00:27:43.150 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.408 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:43.408 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.408 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:43.408 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:43.665 13:46:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:43.923 [2024-07-15 13:46:23.106567] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:27:43.923 [2024-07-15 13:46:23.106619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:43.923 [2024-07-15 13:46:23.106642] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19e1900 00:27:43.923 [2024-07-15 13:46:23.106655] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:43.923 [2024-07-15 13:46:23.106879] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:43.923 [2024-07-15 13:46:23.106895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:43.923 [2024-07-15 13:46:23.106954] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:43.923 [2024-07-15 13:46:23.106966] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:43.923 [2024-07-15 13:46:23.106977] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:43.923 BaseBdev1 00:27:43.923 13:46:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.854 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:44.854 
13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.855 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.855 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.855 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.855 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.855 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.112 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.112 "name": "raid_bdev1", 00:27:45.112 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:45.112 "strip_size_kb": 0, 00:27:45.112 "state": "online", 00:27:45.112 "raid_level": "raid1", 00:27:45.112 "superblock": true, 00:27:45.112 "num_base_bdevs": 2, 00:27:45.112 "num_base_bdevs_discovered": 1, 00:27:45.112 "num_base_bdevs_operational": 1, 00:27:45.112 "base_bdevs_list": [ 00:27:45.112 { 00:27:45.112 "name": null, 00:27:45.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.112 "is_configured": false, 00:27:45.112 "data_offset": 256, 00:27:45.112 "data_size": 7936 00:27:45.112 }, 00:27:45.112 { 00:27:45.112 "name": "BaseBdev2", 00:27:45.112 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:45.112 "is_configured": true, 00:27:45.112 "data_offset": 256, 00:27:45.112 "data_size": 7936 00:27:45.112 } 00:27:45.112 ] 00:27:45.112 }' 00:27:45.112 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.112 13:46:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.043 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.300 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.300 "name": "raid_bdev1", 00:27:46.300 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:46.300 "strip_size_kb": 0, 00:27:46.300 "state": "online", 00:27:46.300 "raid_level": "raid1", 00:27:46.300 "superblock": true, 00:27:46.300 "num_base_bdevs": 2, 00:27:46.300 "num_base_bdevs_discovered": 1, 00:27:46.300 "num_base_bdevs_operational": 1, 00:27:46.300 "base_bdevs_list": [ 00:27:46.300 { 00:27:46.300 "name": null, 00:27:46.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.300 "is_configured": false, 00:27:46.300 "data_offset": 256, 00:27:46.300 "data_size": 7936 00:27:46.300 }, 00:27:46.300 { 00:27:46.300 "name": "BaseBdev2", 00:27:46.300 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:46.300 "is_configured": true, 00:27:46.300 "data_offset": 256, 00:27:46.300 "data_size": 7936 00:27:46.300 } 00:27:46.300 ] 00:27:46.300 }' 00:27:46.300 13:46:25 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.300 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:46.301 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:46.558 [2024-07-15 13:46:25.773664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:46.558 [2024-07-15 13:46:25.773814] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:46.558 [2024-07-15 13:46:25.773830] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:46.558 request: 00:27:46.558 { 00:27:46.558 "base_bdev": "BaseBdev1", 00:27:46.558 "raid_bdev": "raid_bdev1", 00:27:46.558 "method": "bdev_raid_add_base_bdev", 00:27:46.558 "req_id": 1 00:27:46.558 } 00:27:46.558 Got JSON-RPC error response 00:27:46.558 response: 00:27:46.558 { 00:27:46.558 "code": -22, 00:27:46.558 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:46.558 } 00:27:46.558 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:46.558 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:46.558 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:46.558 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:46.558 13:46:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.487 13:46:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.743 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.743 "name": "raid_bdev1", 00:27:47.743 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:47.743 "strip_size_kb": 0, 00:27:47.743 "state": "online", 00:27:47.743 "raid_level": "raid1", 00:27:47.743 "superblock": true, 00:27:47.743 "num_base_bdevs": 2, 00:27:47.743 "num_base_bdevs_discovered": 1, 
00:27:47.743 "num_base_bdevs_operational": 1, 00:27:47.743 "base_bdevs_list": [ 00:27:47.743 { 00:27:47.743 "name": null, 00:27:47.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.743 "is_configured": false, 00:27:47.743 "data_offset": 256, 00:27:47.743 "data_size": 7936 00:27:47.743 }, 00:27:47.743 { 00:27:47.743 "name": "BaseBdev2", 00:27:47.743 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:47.743 "is_configured": true, 00:27:47.743 "data_offset": 256, 00:27:47.743 "data_size": 7936 00:27:47.743 } 00:27:47.743 ] 00:27:47.743 }' 00:27:47.743 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.743 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.306 13:46:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.868 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.868 "name": "raid_bdev1", 00:27:48.868 "uuid": "53a882db-b74f-4781-8fc7-85b0182e1a3e", 00:27:48.868 "strip_size_kb": 0, 00:27:48.868 
"state": "online", 00:27:48.868 "raid_level": "raid1", 00:27:48.868 "superblock": true, 00:27:48.868 "num_base_bdevs": 2, 00:27:48.868 "num_base_bdevs_discovered": 1, 00:27:48.868 "num_base_bdevs_operational": 1, 00:27:48.868 "base_bdevs_list": [ 00:27:48.868 { 00:27:48.868 "name": null, 00:27:48.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.868 "is_configured": false, 00:27:48.868 "data_offset": 256, 00:27:48.868 "data_size": 7936 00:27:48.868 }, 00:27:48.868 { 00:27:48.868 "name": "BaseBdev2", 00:27:48.868 "uuid": "ce1f2a75-5921-5a32-9501-45ae0bd185a2", 00:27:48.868 "is_configured": true, 00:27:48.868 "data_offset": 256, 00:27:48.868 "data_size": 7936 00:27:48.868 } 00:27:48.868 ] 00:27:48.868 }' 00:27:48.868 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.868 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2219003 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2219003 ']' 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2219003 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:48.869 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2219003 00:27:49.126 13:46:28 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:49.126 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:49.126 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2219003' 00:27:49.126 killing process with pid 2219003 00:27:49.126 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2219003 00:27:49.126 Received shutdown signal, test time was about 60.000000 seconds 00:27:49.126 00:27:49.126 Latency(us) 00:27:49.126 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:49.126 =================================================================================================================== 00:27:49.126 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:49.126 [2024-07-15 13:46:28.328104] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:49.126 [2024-07-15 13:46:28.328205] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:49.126 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2219003 00:27:49.126 [2024-07-15 13:46:28.328257] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:49.126 [2024-07-15 13:46:28.328270] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a477c0 name raid_bdev1, state offline 00:27:49.126 [2024-07-15 13:46:28.367373] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:49.383 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:27:49.383 00:27:49.383 real 0m30.945s 00:27:49.383 user 0m47.965s 00:27:49.383 sys 0m5.012s 00:27:49.383 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:27:49.383 13:46:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:49.383 ************************************ 00:27:49.383 END TEST raid_rebuild_test_sb_md_separate 00:27:49.383 ************************************ 00:27:49.383 13:46:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:49.383 13:46:28 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:27:49.383 13:46:28 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:27:49.383 13:46:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:49.383 13:46:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:49.383 13:46:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:49.383 ************************************ 00:27:49.383 START TEST raid_state_function_test_sb_md_interleaved 00:27:49.383 ************************************ 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:49.383 13:46:28 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2223510 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2223510' 00:27:49.383 Process raid pid: 2223510 00:27:49.383 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2223510 /var/tmp/spdk-raid.sock 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2223510 ']' 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:49.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:49.384 13:46:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:49.384 [2024-07-15 13:46:28.729274] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:27:49.384 [2024-07-15 13:46:28.729338] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:49.641 [2024-07-15 13:46:28.859126] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.641 [2024-07-15 13:46:28.962003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.641 [2024-07-15 13:46:29.019345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:49.641 [2024-07-15 13:46:29.019379] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:50.573 [2024-07-15 13:46:29.895728] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:50.573 [2024-07-15 13:46:29.895772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:50.573 [2024-07-15 13:46:29.895783] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:50.573 [2024-07-15 13:46:29.895795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:50.573 13:46:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.831 13:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.831 "name": "Existed_Raid", 00:27:50.831 "uuid": "d413ffd2-3605-4e42-828d-9931b0f39a0b", 00:27:50.831 "strip_size_kb": 0, 00:27:50.831 "state": "configuring", 00:27:50.831 "raid_level": "raid1", 00:27:50.831 "superblock": true, 00:27:50.831 "num_base_bdevs": 2, 00:27:50.831 "num_base_bdevs_discovered": 0, 00:27:50.831 "num_base_bdevs_operational": 2, 00:27:50.831 "base_bdevs_list": [ 00:27:50.831 { 
00:27:50.831 "name": "BaseBdev1", 00:27:50.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.831 "is_configured": false, 00:27:50.831 "data_offset": 0, 00:27:50.831 "data_size": 0 00:27:50.831 }, 00:27:50.831 { 00:27:50.831 "name": "BaseBdev2", 00:27:50.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.831 "is_configured": false, 00:27:50.831 "data_offset": 0, 00:27:50.831 "data_size": 0 00:27:50.831 } 00:27:50.831 ] 00:27:50.831 }' 00:27:50.831 13:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.831 13:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:51.395 13:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:51.652 [2024-07-15 13:46:30.974439] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:51.652 [2024-07-15 13:46:30.974473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19faa80 name Existed_Raid, state configuring 00:27:51.652 13:46:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:51.909 [2024-07-15 13:46:31.150931] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:51.909 [2024-07-15 13:46:31.150965] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:51.909 [2024-07-15 13:46:31.150975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:51.909 [2024-07-15 13:46:31.150987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:51.909 
13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:27:52.167 [2024-07-15 13:46:31.409527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:52.167 BaseBdev1 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:52.167 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:52.425 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:52.682 [ 00:27:52.682 { 00:27:52.682 "name": "BaseBdev1", 00:27:52.682 "aliases": [ 00:27:52.682 "07ba6c45-788e-4be2-8f5e-36ab6044278b" 00:27:52.682 ], 00:27:52.682 "product_name": "Malloc disk", 00:27:52.682 "block_size": 4128, 00:27:52.682 "num_blocks": 8192, 00:27:52.682 "uuid": "07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:52.682 "md_size": 32, 00:27:52.682 
"md_interleave": true, 00:27:52.682 "dif_type": 0, 00:27:52.682 "assigned_rate_limits": { 00:27:52.682 "rw_ios_per_sec": 0, 00:27:52.682 "rw_mbytes_per_sec": 0, 00:27:52.682 "r_mbytes_per_sec": 0, 00:27:52.682 "w_mbytes_per_sec": 0 00:27:52.682 }, 00:27:52.682 "claimed": true, 00:27:52.682 "claim_type": "exclusive_write", 00:27:52.682 "zoned": false, 00:27:52.682 "supported_io_types": { 00:27:52.682 "read": true, 00:27:52.682 "write": true, 00:27:52.682 "unmap": true, 00:27:52.682 "flush": true, 00:27:52.682 "reset": true, 00:27:52.682 "nvme_admin": false, 00:27:52.682 "nvme_io": false, 00:27:52.682 "nvme_io_md": false, 00:27:52.682 "write_zeroes": true, 00:27:52.682 "zcopy": true, 00:27:52.682 "get_zone_info": false, 00:27:52.682 "zone_management": false, 00:27:52.682 "zone_append": false, 00:27:52.682 "compare": false, 00:27:52.682 "compare_and_write": false, 00:27:52.682 "abort": true, 00:27:52.683 "seek_hole": false, 00:27:52.683 "seek_data": false, 00:27:52.683 "copy": true, 00:27:52.683 "nvme_iov_md": false 00:27:52.683 }, 00:27:52.683 "memory_domains": [ 00:27:52.683 { 00:27:52.683 "dma_device_id": "system", 00:27:52.683 "dma_device_type": 1 00:27:52.683 }, 00:27:52.683 { 00:27:52.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:52.683 "dma_device_type": 2 00:27:52.683 } 00:27:52.683 ], 00:27:52.683 "driver_specific": {} 00:27:52.683 } 00:27:52.683 ] 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:52.683 13:46:31 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.683 13:46:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:52.940 13:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.940 "name": "Existed_Raid", 00:27:52.940 "uuid": "7f0e60f7-61c1-4a0e-8d3a-f48250ad1afe", 00:27:52.940 "strip_size_kb": 0, 00:27:52.940 "state": "configuring", 00:27:52.940 "raid_level": "raid1", 00:27:52.940 "superblock": true, 00:27:52.940 "num_base_bdevs": 2, 00:27:52.940 "num_base_bdevs_discovered": 1, 00:27:52.940 "num_base_bdevs_operational": 2, 00:27:52.940 "base_bdevs_list": [ 00:27:52.940 { 00:27:52.940 "name": "BaseBdev1", 00:27:52.940 "uuid": "07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:52.940 "is_configured": true, 00:27:52.940 "data_offset": 256, 00:27:52.940 "data_size": 7936 00:27:52.940 }, 
00:27:52.940 { 00:27:52.940 "name": "BaseBdev2", 00:27:52.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.940 "is_configured": false, 00:27:52.940 "data_offset": 0, 00:27:52.940 "data_size": 0 00:27:52.940 } 00:27:52.940 ] 00:27:52.940 }' 00:27:52.940 13:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.940 13:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:53.544 13:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:53.544 [2024-07-15 13:46:32.929596] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:53.544 [2024-07-15 13:46:32.929639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19fa350 name Existed_Raid, state configuring 00:27:53.544 13:46:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:53.806 [2024-07-15 13:46:33.174280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:53.806 [2024-07-15 13:46:33.175764] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:53.806 [2024-07-15 13:46:33.175797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.806 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.064 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.064 "name": "Existed_Raid", 00:27:54.064 "uuid": "81f43614-30d0-4953-bc9f-0da547b70fd5", 00:27:54.064 "strip_size_kb": 0, 00:27:54.064 "state": "configuring", 00:27:54.064 "raid_level": "raid1", 00:27:54.064 "superblock": true, 00:27:54.064 "num_base_bdevs": 2, 
00:27:54.064 "num_base_bdevs_discovered": 1, 00:27:54.064 "num_base_bdevs_operational": 2, 00:27:54.064 "base_bdevs_list": [ 00:27:54.064 { 00:27:54.064 "name": "BaseBdev1", 00:27:54.064 "uuid": "07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:54.064 "is_configured": true, 00:27:54.064 "data_offset": 256, 00:27:54.064 "data_size": 7936 00:27:54.064 }, 00:27:54.064 { 00:27:54.064 "name": "BaseBdev2", 00:27:54.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.064 "is_configured": false, 00:27:54.064 "data_offset": 0, 00:27:54.064 "data_size": 0 00:27:54.064 } 00:27:54.064 ] 00:27:54.064 }' 00:27:54.064 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.064 13:46:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:54.630 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:27:54.888 [2024-07-15 13:46:34.268722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:54.888 [2024-07-15 13:46:34.268855] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19fc180 00:27:54.888 [2024-07-15 13:46:34.268868] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:54.888 [2024-07-15 13:46:34.268940] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19fc150 00:27:54.888 [2024-07-15 13:46:34.269017] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19fc180 00:27:54.888 [2024-07-15 13:46:34.269027] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19fc180 00:27:54.888 [2024-07-15 13:46:34.269083] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.888 BaseBdev2 
00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:54.888 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:55.146 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:55.404 [ 00:27:55.404 { 00:27:55.404 "name": "BaseBdev2", 00:27:55.404 "aliases": [ 00:27:55.404 "545ecfbb-429c-418b-8feb-f91c636fb040" 00:27:55.404 ], 00:27:55.404 "product_name": "Malloc disk", 00:27:55.404 "block_size": 4128, 00:27:55.404 "num_blocks": 8192, 00:27:55.404 "uuid": "545ecfbb-429c-418b-8feb-f91c636fb040", 00:27:55.404 "md_size": 32, 00:27:55.404 "md_interleave": true, 00:27:55.404 "dif_type": 0, 00:27:55.404 "assigned_rate_limits": { 00:27:55.404 "rw_ios_per_sec": 0, 00:27:55.404 "rw_mbytes_per_sec": 0, 00:27:55.404 "r_mbytes_per_sec": 0, 00:27:55.404 "w_mbytes_per_sec": 0 00:27:55.404 }, 00:27:55.404 "claimed": true, 00:27:55.404 "claim_type": "exclusive_write", 00:27:55.404 "zoned": false, 00:27:55.404 "supported_io_types": { 
00:27:55.404 "read": true, 00:27:55.404 "write": true, 00:27:55.404 "unmap": true, 00:27:55.404 "flush": true, 00:27:55.404 "reset": true, 00:27:55.404 "nvme_admin": false, 00:27:55.404 "nvme_io": false, 00:27:55.404 "nvme_io_md": false, 00:27:55.404 "write_zeroes": true, 00:27:55.404 "zcopy": true, 00:27:55.404 "get_zone_info": false, 00:27:55.404 "zone_management": false, 00:27:55.404 "zone_append": false, 00:27:55.404 "compare": false, 00:27:55.405 "compare_and_write": false, 00:27:55.405 "abort": true, 00:27:55.405 "seek_hole": false, 00:27:55.405 "seek_data": false, 00:27:55.405 "copy": true, 00:27:55.405 "nvme_iov_md": false 00:27:55.405 }, 00:27:55.405 "memory_domains": [ 00:27:55.405 { 00:27:55.405 "dma_device_id": "system", 00:27:55.405 "dma_device_type": 1 00:27:55.405 }, 00:27:55.405 { 00:27:55.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:55.405 "dma_device_type": 2 00:27:55.405 } 00:27:55.405 ], 00:27:55.405 "driver_specific": {} 00:27:55.405 } 00:27:55.405 ] 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.405 13:46:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:55.664 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.664 "name": "Existed_Raid", 00:27:55.664 "uuid": "81f43614-30d0-4953-bc9f-0da547b70fd5", 00:27:55.664 "strip_size_kb": 0, 00:27:55.664 "state": "online", 00:27:55.664 "raid_level": "raid1", 00:27:55.664 "superblock": true, 00:27:55.664 "num_base_bdevs": 2, 00:27:55.664 "num_base_bdevs_discovered": 2, 00:27:55.664 "num_base_bdevs_operational": 2, 00:27:55.664 "base_bdevs_list": [ 00:27:55.664 { 00:27:55.664 "name": "BaseBdev1", 00:27:55.664 "uuid": "07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:55.664 "is_configured": true, 00:27:55.664 "data_offset": 256, 00:27:55.664 "data_size": 7936 00:27:55.664 }, 00:27:55.664 { 00:27:55.664 "name": "BaseBdev2", 00:27:55.664 "uuid": "545ecfbb-429c-418b-8feb-f91c636fb040", 00:27:55.664 "is_configured": true, 00:27:55.664 "data_offset": 256, 00:27:55.664 
"data_size": 7936 00:27:55.664 } 00:27:55.664 ] 00:27:55.664 }' 00:27:55.664 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.664 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:56.229 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:56.487 [2024-07-15 13:46:35.849237] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:56.487 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:56.487 "name": "Existed_Raid", 00:27:56.487 "aliases": [ 00:27:56.487 "81f43614-30d0-4953-bc9f-0da547b70fd5" 00:27:56.487 ], 00:27:56.487 "product_name": "Raid Volume", 00:27:56.487 "block_size": 4128, 00:27:56.487 "num_blocks": 7936, 00:27:56.487 "uuid": "81f43614-30d0-4953-bc9f-0da547b70fd5", 00:27:56.487 "md_size": 32, 
00:27:56.487 "md_interleave": true, 00:27:56.487 "dif_type": 0, 00:27:56.487 "assigned_rate_limits": { 00:27:56.487 "rw_ios_per_sec": 0, 00:27:56.487 "rw_mbytes_per_sec": 0, 00:27:56.487 "r_mbytes_per_sec": 0, 00:27:56.487 "w_mbytes_per_sec": 0 00:27:56.487 }, 00:27:56.487 "claimed": false, 00:27:56.487 "zoned": false, 00:27:56.487 "supported_io_types": { 00:27:56.487 "read": true, 00:27:56.487 "write": true, 00:27:56.487 "unmap": false, 00:27:56.487 "flush": false, 00:27:56.487 "reset": true, 00:27:56.487 "nvme_admin": false, 00:27:56.487 "nvme_io": false, 00:27:56.487 "nvme_io_md": false, 00:27:56.488 "write_zeroes": true, 00:27:56.488 "zcopy": false, 00:27:56.488 "get_zone_info": false, 00:27:56.488 "zone_management": false, 00:27:56.488 "zone_append": false, 00:27:56.488 "compare": false, 00:27:56.488 "compare_and_write": false, 00:27:56.488 "abort": false, 00:27:56.488 "seek_hole": false, 00:27:56.488 "seek_data": false, 00:27:56.488 "copy": false, 00:27:56.488 "nvme_iov_md": false 00:27:56.488 }, 00:27:56.488 "memory_domains": [ 00:27:56.488 { 00:27:56.488 "dma_device_id": "system", 00:27:56.488 "dma_device_type": 1 00:27:56.488 }, 00:27:56.488 { 00:27:56.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.488 "dma_device_type": 2 00:27:56.488 }, 00:27:56.488 { 00:27:56.488 "dma_device_id": "system", 00:27:56.488 "dma_device_type": 1 00:27:56.488 }, 00:27:56.488 { 00:27:56.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.488 "dma_device_type": 2 00:27:56.488 } 00:27:56.488 ], 00:27:56.488 "driver_specific": { 00:27:56.488 "raid": { 00:27:56.488 "uuid": "81f43614-30d0-4953-bc9f-0da547b70fd5", 00:27:56.488 "strip_size_kb": 0, 00:27:56.488 "state": "online", 00:27:56.488 "raid_level": "raid1", 00:27:56.488 "superblock": true, 00:27:56.488 "num_base_bdevs": 2, 00:27:56.488 "num_base_bdevs_discovered": 2, 00:27:56.488 "num_base_bdevs_operational": 2, 00:27:56.488 "base_bdevs_list": [ 00:27:56.488 { 00:27:56.488 "name": "BaseBdev1", 00:27:56.488 "uuid": 
"07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:56.488 "is_configured": true, 00:27:56.488 "data_offset": 256, 00:27:56.488 "data_size": 7936 00:27:56.488 }, 00:27:56.488 { 00:27:56.488 "name": "BaseBdev2", 00:27:56.488 "uuid": "545ecfbb-429c-418b-8feb-f91c636fb040", 00:27:56.488 "is_configured": true, 00:27:56.488 "data_offset": 256, 00:27:56.488 "data_size": 7936 00:27:56.488 } 00:27:56.488 ] 00:27:56.488 } 00:27:56.488 } 00:27:56.488 }' 00:27:56.488 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:56.747 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:56.747 BaseBdev2' 00:27:56.747 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:56.747 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:56.747 13:46:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:56.747 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:56.747 "name": "BaseBdev1", 00:27:56.747 "aliases": [ 00:27:56.747 "07ba6c45-788e-4be2-8f5e-36ab6044278b" 00:27:56.747 ], 00:27:56.747 "product_name": "Malloc disk", 00:27:56.747 "block_size": 4128, 00:27:56.747 "num_blocks": 8192, 00:27:56.747 "uuid": "07ba6c45-788e-4be2-8f5e-36ab6044278b", 00:27:56.747 "md_size": 32, 00:27:56.747 "md_interleave": true, 00:27:56.747 "dif_type": 0, 00:27:56.747 "assigned_rate_limits": { 00:27:56.747 "rw_ios_per_sec": 0, 00:27:56.747 "rw_mbytes_per_sec": 0, 00:27:56.747 "r_mbytes_per_sec": 0, 00:27:56.747 "w_mbytes_per_sec": 0 00:27:56.747 }, 00:27:56.747 "claimed": 
true, 00:27:56.747 "claim_type": "exclusive_write", 00:27:56.747 "zoned": false, 00:27:56.747 "supported_io_types": { 00:27:56.747 "read": true, 00:27:56.747 "write": true, 00:27:56.747 "unmap": true, 00:27:56.747 "flush": true, 00:27:56.747 "reset": true, 00:27:56.747 "nvme_admin": false, 00:27:56.747 "nvme_io": false, 00:27:56.747 "nvme_io_md": false, 00:27:56.747 "write_zeroes": true, 00:27:56.747 "zcopy": true, 00:27:56.747 "get_zone_info": false, 00:27:56.747 "zone_management": false, 00:27:56.747 "zone_append": false, 00:27:56.747 "compare": false, 00:27:56.747 "compare_and_write": false, 00:27:56.747 "abort": true, 00:27:56.747 "seek_hole": false, 00:27:56.747 "seek_data": false, 00:27:56.747 "copy": true, 00:27:56.747 "nvme_iov_md": false 00:27:56.747 }, 00:27:56.747 "memory_domains": [ 00:27:56.747 { 00:27:56.747 "dma_device_id": "system", 00:27:56.747 "dma_device_type": 1 00:27:56.747 }, 00:27:56.747 { 00:27:56.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.747 "dma_device_type": 2 00:27:56.747 } 00:27:56.747 ], 00:27:56.747 "driver_specific": {} 00:27:56.747 }' 00:27:56.747 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:57.007 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.007 13:46:36 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:57.265 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:57.533 "name": "BaseBdev2", 00:27:57.533 "aliases": [ 00:27:57.533 "545ecfbb-429c-418b-8feb-f91c636fb040" 00:27:57.533 ], 00:27:57.533 "product_name": "Malloc disk", 00:27:57.533 "block_size": 4128, 00:27:57.533 "num_blocks": 8192, 00:27:57.533 "uuid": "545ecfbb-429c-418b-8feb-f91c636fb040", 00:27:57.533 "md_size": 32, 00:27:57.533 "md_interleave": true, 00:27:57.533 "dif_type": 0, 00:27:57.533 "assigned_rate_limits": { 00:27:57.533 "rw_ios_per_sec": 0, 00:27:57.533 "rw_mbytes_per_sec": 0, 00:27:57.533 "r_mbytes_per_sec": 0, 00:27:57.533 "w_mbytes_per_sec": 0 00:27:57.533 }, 00:27:57.533 "claimed": true, 00:27:57.533 "claim_type": "exclusive_write", 00:27:57.533 "zoned": false, 00:27:57.533 "supported_io_types": { 00:27:57.533 "read": true, 00:27:57.533 "write": true, 00:27:57.533 "unmap": true, 00:27:57.533 
"flush": true, 00:27:57.533 "reset": true, 00:27:57.533 "nvme_admin": false, 00:27:57.533 "nvme_io": false, 00:27:57.533 "nvme_io_md": false, 00:27:57.533 "write_zeroes": true, 00:27:57.533 "zcopy": true, 00:27:57.533 "get_zone_info": false, 00:27:57.533 "zone_management": false, 00:27:57.533 "zone_append": false, 00:27:57.533 "compare": false, 00:27:57.533 "compare_and_write": false, 00:27:57.533 "abort": true, 00:27:57.533 "seek_hole": false, 00:27:57.533 "seek_data": false, 00:27:57.533 "copy": true, 00:27:57.533 "nvme_iov_md": false 00:27:57.533 }, 00:27:57.533 "memory_domains": [ 00:27:57.533 { 00:27:57.533 "dma_device_id": "system", 00:27:57.533 "dma_device_type": 1 00:27:57.533 }, 00:27:57.533 { 00:27:57.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:57.533 "dma_device_type": 2 00:27:57.533 } 00:27:57.533 ], 00:27:57.533 "driver_specific": {} 00:27:57.533 }' 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:57.533 13:46:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.790 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.790 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:57.790 13:46:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.790 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.790 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:57.790 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:58.048 [2024-07-15 13:46:37.369037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.048 13:46:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.048 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:58.305 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.305 "name": "Existed_Raid", 00:27:58.305 "uuid": "81f43614-30d0-4953-bc9f-0da547b70fd5", 00:27:58.305 "strip_size_kb": 0, 00:27:58.305 "state": "online", 00:27:58.305 "raid_level": "raid1", 00:27:58.305 "superblock": true, 00:27:58.306 "num_base_bdevs": 2, 00:27:58.306 "num_base_bdevs_discovered": 1, 00:27:58.306 "num_base_bdevs_operational": 1, 00:27:58.306 "base_bdevs_list": [ 00:27:58.306 { 00:27:58.306 "name": null, 00:27:58.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.306 "is_configured": false, 00:27:58.306 "data_offset": 256, 00:27:58.306 "data_size": 7936 00:27:58.306 }, 00:27:58.306 { 00:27:58.306 "name": "BaseBdev2", 00:27:58.306 "uuid": "545ecfbb-429c-418b-8feb-f91c636fb040", 00:27:58.306 "is_configured": true, 00:27:58.306 "data_offset": 256, 00:27:58.306 "data_size": 7936 00:27:58.306 } 00:27:58.306 ] 00:27:58.306 }' 00:27:58.306 
13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.306 13:46:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:58.869 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:58.869 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:58.869 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:58.869 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.127 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:59.127 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:59.127 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:59.384 [2024-07-15 13:46:38.654415] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:59.384 [2024-07-15 13:46:38.654501] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:59.384 [2024-07-15 13:46:38.665858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:59.384 [2024-07-15 13:46:38.665892] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:59.384 [2024-07-15 13:46:38.665904] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19fc180 name Existed_Raid, state offline 00:27:59.384 13:46:38 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:59.384 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:59.384 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.384 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2223510 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2223510 ']' 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2223510 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2223510 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2223510' 00:27:59.641 killing process with pid 2223510 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2223510 00:27:59.641 [2024-07-15 13:46:38.982061] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:59.641 13:46:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2223510 00:27:59.641 [2024-07-15 13:46:38.982971] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:59.899 13:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:27:59.899 00:27:59.899 real 0m10.536s 00:27:59.899 user 0m18.766s 00:27:59.899 sys 0m1.922s 00:27:59.899 13:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:59.899 13:46:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:59.899 ************************************ 00:27:59.899 END TEST raid_state_function_test_sb_md_interleaved 00:27:59.899 ************************************ 00:27:59.899 13:46:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:59.899 13:46:39 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:27:59.899 13:46:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:59.899 13:46:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:59.899 13:46:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:59.899 ************************************ 00:27:59.899 START TEST raid_superblock_test_md_interleaved 00:27:59.899 ************************************ 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2225116 00:27:59.899 13:46:39 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2225116 /var/tmp/spdk-raid.sock 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2225116 ']' 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:59.899 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:59.899 13:46:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:00.157 [2024-07-15 13:46:39.342630] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:00.157 [2024-07-15 13:46:39.342696] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2225116 ] 00:28:00.157 [2024-07-15 13:46:39.472656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:00.157 [2024-07-15 13:46:39.578631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:00.414 [2024-07-15 13:46:39.647739] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:00.414 [2024-07-15 13:46:39.647776] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:28:00.993 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:28:01.251 malloc1 00:28:01.251 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:01.251 [2024-07-15 13:46:40.662775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:01.251 [2024-07-15 13:46:40.662825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.251 [2024-07-15 13:46:40.662855] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17844e0 00:28:01.251 [2024-07-15 13:46:40.662868] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.251 [2024-07-15 13:46:40.664416] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.251 [2024-07-15 13:46:40.664443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:01.251 pt1 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:28:01.509 malloc2 00:28:01.509 13:46:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:01.767 [2024-07-15 13:46:41.149072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:01.767 [2024-07-15 13:46:41.149120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.767 [2024-07-15 13:46:41.149141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1769570 00:28:01.767 [2024-07-15 13:46:41.149155] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.767 [2024-07-15 13:46:41.150634] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.767 [2024-07-15 13:46:41.150662] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:01.767 pt2 00:28:01.767 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:01.767 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:01.767 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:28:02.024 [2024-07-15 13:46:41.309520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:02.024 [2024-07-15 13:46:41.311012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:02.024 [2024-07-15 13:46:41.311169] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x176af20 00:28:02.024 [2024-07-15 13:46:41.311182] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:02.024 [2024-07-15 13:46:41.311254] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15e7050 00:28:02.024 [2024-07-15 13:46:41.311340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x176af20 00:28:02.024 [2024-07-15 13:46:41.311350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x176af20 00:28:02.024 [2024-07-15 13:46:41.311410] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.024 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.282 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.282 "name": "raid_bdev1", 00:28:02.282 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:02.282 "strip_size_kb": 0, 00:28:02.282 "state": "online", 00:28:02.282 "raid_level": "raid1", 00:28:02.282 "superblock": true, 00:28:02.282 "num_base_bdevs": 2, 00:28:02.282 "num_base_bdevs_discovered": 2, 00:28:02.282 "num_base_bdevs_operational": 2, 00:28:02.282 "base_bdevs_list": [ 00:28:02.282 { 00:28:02.282 "name": "pt1", 00:28:02.282 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:02.282 "is_configured": true, 00:28:02.282 "data_offset": 256, 00:28:02.282 "data_size": 7936 00:28:02.282 }, 00:28:02.282 { 00:28:02.282 "name": "pt2", 00:28:02.282 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:02.282 "is_configured": true, 00:28:02.282 "data_offset": 256, 00:28:02.282 "data_size": 7936 00:28:02.282 } 00:28:02.282 ] 00:28:02.282 }' 00:28:02.282 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.282 13:46:41 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:02.861 13:46:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:02.861 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:03.118 [2024-07-15 13:46:42.320492] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:03.118 "name": "raid_bdev1", 00:28:03.118 "aliases": [ 00:28:03.118 "c757f804-2011-4e43-9d4c-2125b4bfebcc" 00:28:03.118 ], 00:28:03.118 "product_name": "Raid Volume", 00:28:03.118 "block_size": 4128, 00:28:03.118 "num_blocks": 7936, 00:28:03.118 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:03.118 "md_size": 32, 00:28:03.118 "md_interleave": true, 00:28:03.118 "dif_type": 0, 00:28:03.118 "assigned_rate_limits": { 00:28:03.118 "rw_ios_per_sec": 0, 00:28:03.118 "rw_mbytes_per_sec": 0, 00:28:03.118 "r_mbytes_per_sec": 0, 00:28:03.118 "w_mbytes_per_sec": 0 00:28:03.118 }, 00:28:03.118 "claimed": false, 00:28:03.118 "zoned": false, 00:28:03.118 "supported_io_types": { 00:28:03.118 "read": true, 00:28:03.118 "write": true, 00:28:03.118 "unmap": false, 00:28:03.118 "flush": false, 00:28:03.118 "reset": true, 00:28:03.118 "nvme_admin": false, 
00:28:03.118 "nvme_io": false, 00:28:03.118 "nvme_io_md": false, 00:28:03.118 "write_zeroes": true, 00:28:03.118 "zcopy": false, 00:28:03.118 "get_zone_info": false, 00:28:03.118 "zone_management": false, 00:28:03.118 "zone_append": false, 00:28:03.118 "compare": false, 00:28:03.118 "compare_and_write": false, 00:28:03.118 "abort": false, 00:28:03.118 "seek_hole": false, 00:28:03.118 "seek_data": false, 00:28:03.118 "copy": false, 00:28:03.118 "nvme_iov_md": false 00:28:03.118 }, 00:28:03.118 "memory_domains": [ 00:28:03.118 { 00:28:03.118 "dma_device_id": "system", 00:28:03.118 "dma_device_type": 1 00:28:03.118 }, 00:28:03.118 { 00:28:03.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.118 "dma_device_type": 2 00:28:03.118 }, 00:28:03.118 { 00:28:03.118 "dma_device_id": "system", 00:28:03.118 "dma_device_type": 1 00:28:03.118 }, 00:28:03.118 { 00:28:03.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.118 "dma_device_type": 2 00:28:03.118 } 00:28:03.118 ], 00:28:03.118 "driver_specific": { 00:28:03.118 "raid": { 00:28:03.118 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:03.118 "strip_size_kb": 0, 00:28:03.118 "state": "online", 00:28:03.118 "raid_level": "raid1", 00:28:03.118 "superblock": true, 00:28:03.118 "num_base_bdevs": 2, 00:28:03.118 "num_base_bdevs_discovered": 2, 00:28:03.118 "num_base_bdevs_operational": 2, 00:28:03.118 "base_bdevs_list": [ 00:28:03.118 { 00:28:03.118 "name": "pt1", 00:28:03.118 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.118 "is_configured": true, 00:28:03.118 "data_offset": 256, 00:28:03.118 "data_size": 7936 00:28:03.118 }, 00:28:03.118 { 00:28:03.118 "name": "pt2", 00:28:03.118 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.118 "is_configured": true, 00:28:03.118 "data_offset": 256, 00:28:03.118 "data_size": 7936 00:28:03.118 } 00:28:03.118 ] 00:28:03.118 } 00:28:03.118 } 00:28:03.118 }' 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:03.118 pt2' 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:03.118 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:03.375 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:03.375 "name": "pt1", 00:28:03.375 "aliases": [ 00:28:03.375 "00000000-0000-0000-0000-000000000001" 00:28:03.375 ], 00:28:03.375 "product_name": "passthru", 00:28:03.375 "block_size": 4128, 00:28:03.375 "num_blocks": 8192, 00:28:03.375 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.375 "md_size": 32, 00:28:03.375 "md_interleave": true, 00:28:03.375 "dif_type": 0, 00:28:03.375 "assigned_rate_limits": { 00:28:03.375 "rw_ios_per_sec": 0, 00:28:03.375 "rw_mbytes_per_sec": 0, 00:28:03.375 "r_mbytes_per_sec": 0, 00:28:03.375 "w_mbytes_per_sec": 0 00:28:03.375 }, 00:28:03.375 "claimed": true, 00:28:03.375 "claim_type": "exclusive_write", 00:28:03.375 "zoned": false, 00:28:03.375 "supported_io_types": { 00:28:03.375 "read": true, 00:28:03.375 "write": true, 00:28:03.375 "unmap": true, 00:28:03.375 "flush": true, 00:28:03.375 "reset": true, 00:28:03.375 "nvme_admin": false, 00:28:03.375 "nvme_io": false, 00:28:03.375 "nvme_io_md": false, 00:28:03.375 "write_zeroes": true, 00:28:03.375 "zcopy": true, 00:28:03.375 "get_zone_info": false, 00:28:03.375 "zone_management": false, 00:28:03.375 "zone_append": false, 00:28:03.375 "compare": false, 00:28:03.375 "compare_and_write": false, 00:28:03.375 
"abort": true, 00:28:03.375 "seek_hole": false, 00:28:03.375 "seek_data": false, 00:28:03.375 "copy": true, 00:28:03.375 "nvme_iov_md": false 00:28:03.375 }, 00:28:03.375 "memory_domains": [ 00:28:03.375 { 00:28:03.375 "dma_device_id": "system", 00:28:03.375 "dma_device_type": 1 00:28:03.375 }, 00:28:03.375 { 00:28:03.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.375 "dma_device_type": 2 00:28:03.375 } 00:28:03.375 ], 00:28:03.375 "driver_specific": { 00:28:03.375 "passthru": { 00:28:03.375 "name": "pt1", 00:28:03.375 "base_bdev_name": "malloc1" 00:28:03.375 } 00:28:03.375 } 00:28:03.375 }' 00:28:03.375 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:03.375 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:03.375 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:03.376 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:03.376 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:03.632 13:46:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:03.632 13:46:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:03.889 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:03.889 "name": "pt2", 00:28:03.889 "aliases": [ 00:28:03.889 "00000000-0000-0000-0000-000000000002" 00:28:03.889 ], 00:28:03.889 "product_name": "passthru", 00:28:03.889 "block_size": 4128, 00:28:03.889 "num_blocks": 8192, 00:28:03.889 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.889 "md_size": 32, 00:28:03.889 "md_interleave": true, 00:28:03.889 "dif_type": 0, 00:28:03.889 "assigned_rate_limits": { 00:28:03.889 "rw_ios_per_sec": 0, 00:28:03.889 "rw_mbytes_per_sec": 0, 00:28:03.889 "r_mbytes_per_sec": 0, 00:28:03.889 "w_mbytes_per_sec": 0 00:28:03.889 }, 00:28:03.889 "claimed": true, 00:28:03.889 "claim_type": "exclusive_write", 00:28:03.889 "zoned": false, 00:28:03.889 "supported_io_types": { 00:28:03.889 "read": true, 00:28:03.889 "write": true, 00:28:03.889 "unmap": true, 00:28:03.889 "flush": true, 00:28:03.889 "reset": true, 00:28:03.889 "nvme_admin": false, 00:28:03.889 "nvme_io": false, 00:28:03.889 "nvme_io_md": false, 00:28:03.889 "write_zeroes": true, 00:28:03.889 "zcopy": true, 00:28:03.889 "get_zone_info": false, 00:28:03.889 "zone_management": false, 00:28:03.889 "zone_append": false, 00:28:03.889 "compare": false, 00:28:03.889 "compare_and_write": false, 00:28:03.889 "abort": true, 00:28:03.889 "seek_hole": false, 00:28:03.889 "seek_data": false, 00:28:03.889 "copy": true, 00:28:03.889 "nvme_iov_md": false 00:28:03.889 }, 00:28:03.889 "memory_domains": [ 00:28:03.889 { 00:28:03.889 "dma_device_id": 
"system", 00:28:03.889 "dma_device_type": 1 00:28:03.889 }, 00:28:03.889 { 00:28:03.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.889 "dma_device_type": 2 00:28:03.889 } 00:28:03.889 ], 00:28:03.889 "driver_specific": { 00:28:03.889 "passthru": { 00:28:03.889 "name": "pt2", 00:28:03.889 "base_bdev_name": "malloc2" 00:28:03.889 } 00:28:03.889 } 00:28:03.889 }' 00:28:03.889 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:03.889 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.146 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:04.146 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.146 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.146 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:04.147 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.147 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.147 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:04.147 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.147 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.404 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:04.404 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:04.404 13:46:43 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:04.661 [2024-07-15 13:46:43.844562] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:04.661 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c757f804-2011-4e43-9d4c-2125b4bfebcc 00:28:04.661 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z c757f804-2011-4e43-9d4c-2125b4bfebcc ']' 00:28:04.661 13:46:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:04.919 [2024-07-15 13:46:44.088936] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:04.919 [2024-07-15 13:46:44.088960] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:04.919 [2024-07-15 13:46:44.089019] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:04.919 [2024-07-15 13:46:44.089078] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:04.919 [2024-07-15 13:46:44.089090] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x176af20 name raid_bdev1, state offline 00:28:04.919 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.919 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:05.176 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:05.176 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:05.176 13:46:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:05.176 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:05.176 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:05.176 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:05.434 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:05.434 13:46:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:05.692 13:46:45 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:05.692 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.258 [2024-07-15 13:46:45.560771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:06.258 [2024-07-15 13:46:45.562125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:06.258 [2024-07-15 13:46:45.562180] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:06.258 [2024-07-15 13:46:45.562220] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:06.258 [2024-07-15 13:46:45.562240] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:06.258 [2024-07-15 13:46:45.562250] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1775260 name raid_bdev1, state configuring 00:28:06.258 request: 00:28:06.258 { 00:28:06.258 "name": "raid_bdev1", 00:28:06.258 "raid_level": "raid1", 00:28:06.258 "base_bdevs": [ 00:28:06.258 "malloc1", 00:28:06.258 "malloc2" 00:28:06.258 ], 00:28:06.258 "superblock": false, 00:28:06.258 "method": "bdev_raid_create", 00:28:06.258 "req_id": 1 00:28:06.258 } 00:28:06.258 Got JSON-RPC error response 00:28:06.258 response: 00:28:06.258 { 00:28:06.258 "code": -17, 00:28:06.258 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:06.258 } 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.258 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:06.517 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:06.517 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:06.517 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:28:06.778 [2024-07-15 13:46:45.973809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:06.778 [2024-07-15 13:46:45.973859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:06.778 [2024-07-15 13:46:45.973878] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176c000 00:28:06.778 [2024-07-15 13:46:45.973890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:06.778 [2024-07-15 13:46:45.975348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:06.778 [2024-07-15 13:46:45.975376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:06.778 [2024-07-15 13:46:45.975426] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:06.778 [2024-07-15 13:46:45.975453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:06.778 pt1 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.778 13:46:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.778 13:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.778 "name": "raid_bdev1", 00:28:06.778 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:06.778 "strip_size_kb": 0, 00:28:06.778 "state": "configuring", 00:28:06.778 "raid_level": "raid1", 00:28:06.778 "superblock": true, 00:28:06.778 "num_base_bdevs": 2, 00:28:06.778 "num_base_bdevs_discovered": 1, 00:28:06.778 "num_base_bdevs_operational": 2, 00:28:06.778 "base_bdevs_list": [ 00:28:06.778 { 00:28:06.778 "name": "pt1", 00:28:06.778 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:06.778 "is_configured": true, 00:28:06.778 "data_offset": 256, 00:28:06.778 "data_size": 7936 00:28:06.778 }, 00:28:06.778 { 00:28:06.778 "name": null, 00:28:06.778 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:06.778 "is_configured": false, 00:28:06.778 "data_offset": 256, 00:28:06.778 "data_size": 7936 00:28:06.778 } 00:28:06.778 ] 00:28:06.778 }' 00:28:06.778 13:46:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.778 13:46:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:07.750 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:07.750 13:46:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:07.750 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:07.750 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:08.009 [2024-07-15 13:46:47.185046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:08.009 [2024-07-15 13:46:47.185098] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:08.009 [2024-07-15 13:46:47.185120] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176e270 00:28:08.009 [2024-07-15 13:46:47.185132] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:08.009 [2024-07-15 13:46:47.185302] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:08.009 [2024-07-15 13:46:47.185319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:08.009 [2024-07-15 13:46:47.185365] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:08.009 [2024-07-15 13:46:47.185385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:08.009 [2024-07-15 13:46:47.185467] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15e7c10 00:28:08.009 [2024-07-15 13:46:47.185477] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:08.009 [2024-07-15 13:46:47.185532] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1769d40 00:28:08.009 [2024-07-15 13:46:47.185606] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15e7c10 00:28:08.009 [2024-07-15 13:46:47.185615] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15e7c10 00:28:08.009 [2024-07-15 13:46:47.185674] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.009 pt2 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.009 13:46:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.009 "name": "raid_bdev1", 00:28:08.009 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:08.009 "strip_size_kb": 0, 00:28:08.009 "state": "online", 00:28:08.009 "raid_level": "raid1", 00:28:08.009 "superblock": true, 00:28:08.009 "num_base_bdevs": 2, 00:28:08.009 "num_base_bdevs_discovered": 2, 00:28:08.009 "num_base_bdevs_operational": 2, 00:28:08.009 "base_bdevs_list": [ 00:28:08.009 { 00:28:08.009 "name": "pt1", 00:28:08.009 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:08.009 "is_configured": true, 00:28:08.009 "data_offset": 256, 00:28:08.009 "data_size": 7936 00:28:08.009 }, 00:28:08.009 { 00:28:08.009 "name": "pt2", 00:28:08.009 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:08.009 "is_configured": true, 00:28:08.009 "data_offset": 256, 00:28:08.009 "data_size": 7936 00:28:08.009 } 00:28:08.009 ] 00:28:08.009 }' 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.009 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:08.576 13:46:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:08.576 13:46:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:08.834 [2024-07-15 13:46:48.187968] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:08.834 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:08.834 "name": "raid_bdev1", 00:28:08.834 "aliases": [ 00:28:08.834 "c757f804-2011-4e43-9d4c-2125b4bfebcc" 00:28:08.834 ], 00:28:08.834 "product_name": "Raid Volume", 00:28:08.834 "block_size": 4128, 00:28:08.834 "num_blocks": 7936, 00:28:08.834 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:08.834 "md_size": 32, 00:28:08.834 "md_interleave": true, 00:28:08.834 "dif_type": 0, 00:28:08.834 "assigned_rate_limits": { 00:28:08.834 "rw_ios_per_sec": 0, 00:28:08.834 "rw_mbytes_per_sec": 0, 00:28:08.834 "r_mbytes_per_sec": 0, 00:28:08.834 "w_mbytes_per_sec": 0 00:28:08.834 }, 00:28:08.834 "claimed": false, 00:28:08.834 "zoned": false, 00:28:08.834 "supported_io_types": { 00:28:08.834 "read": true, 00:28:08.834 "write": true, 00:28:08.834 "unmap": false, 00:28:08.834 "flush": false, 00:28:08.834 "reset": true, 00:28:08.834 "nvme_admin": false, 00:28:08.834 "nvme_io": false, 00:28:08.834 "nvme_io_md": false, 00:28:08.834 "write_zeroes": true, 00:28:08.834 "zcopy": false, 00:28:08.834 "get_zone_info": false, 00:28:08.834 "zone_management": false, 00:28:08.834 "zone_append": false, 00:28:08.834 "compare": false, 00:28:08.834 "compare_and_write": false, 00:28:08.834 "abort": false, 00:28:08.834 "seek_hole": false, 00:28:08.834 "seek_data": false, 00:28:08.834 "copy": false, 00:28:08.834 "nvme_iov_md": false 00:28:08.834 }, 
00:28:08.834 "memory_domains": [ 00:28:08.834 { 00:28:08.834 "dma_device_id": "system", 00:28:08.834 "dma_device_type": 1 00:28:08.834 }, 00:28:08.834 { 00:28:08.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.834 "dma_device_type": 2 00:28:08.834 }, 00:28:08.834 { 00:28:08.834 "dma_device_id": "system", 00:28:08.834 "dma_device_type": 1 00:28:08.834 }, 00:28:08.834 { 00:28:08.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.834 "dma_device_type": 2 00:28:08.834 } 00:28:08.834 ], 00:28:08.834 "driver_specific": { 00:28:08.834 "raid": { 00:28:08.834 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:08.834 "strip_size_kb": 0, 00:28:08.834 "state": "online", 00:28:08.834 "raid_level": "raid1", 00:28:08.834 "superblock": true, 00:28:08.834 "num_base_bdevs": 2, 00:28:08.834 "num_base_bdevs_discovered": 2, 00:28:08.834 "num_base_bdevs_operational": 2, 00:28:08.834 "base_bdevs_list": [ 00:28:08.834 { 00:28:08.834 "name": "pt1", 00:28:08.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:08.834 "is_configured": true, 00:28:08.834 "data_offset": 256, 00:28:08.834 "data_size": 7936 00:28:08.834 }, 00:28:08.834 { 00:28:08.834 "name": "pt2", 00:28:08.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:08.834 "is_configured": true, 00:28:08.834 "data_offset": 256, 00:28:08.834 "data_size": 7936 00:28:08.834 } 00:28:08.834 ] 00:28:08.834 } 00:28:08.834 } 00:28:08.834 }' 00:28:08.834 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:08.834 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:08.834 pt2' 00:28:08.834 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:08.834 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:09.091 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:09.091 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:09.091 "name": "pt1", 00:28:09.091 "aliases": [ 00:28:09.091 "00000000-0000-0000-0000-000000000001" 00:28:09.091 ], 00:28:09.091 "product_name": "passthru", 00:28:09.091 "block_size": 4128, 00:28:09.091 "num_blocks": 8192, 00:28:09.091 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:09.091 "md_size": 32, 00:28:09.091 "md_interleave": true, 00:28:09.091 "dif_type": 0, 00:28:09.091 "assigned_rate_limits": { 00:28:09.091 "rw_ios_per_sec": 0, 00:28:09.091 "rw_mbytes_per_sec": 0, 00:28:09.091 "r_mbytes_per_sec": 0, 00:28:09.091 "w_mbytes_per_sec": 0 00:28:09.091 }, 00:28:09.091 "claimed": true, 00:28:09.091 "claim_type": "exclusive_write", 00:28:09.091 "zoned": false, 00:28:09.091 "supported_io_types": { 00:28:09.091 "read": true, 00:28:09.091 "write": true, 00:28:09.091 "unmap": true, 00:28:09.091 "flush": true, 00:28:09.091 "reset": true, 00:28:09.091 "nvme_admin": false, 00:28:09.091 "nvme_io": false, 00:28:09.091 "nvme_io_md": false, 00:28:09.091 "write_zeroes": true, 00:28:09.091 "zcopy": true, 00:28:09.091 "get_zone_info": false, 00:28:09.091 "zone_management": false, 00:28:09.091 "zone_append": false, 00:28:09.091 "compare": false, 00:28:09.091 "compare_and_write": false, 00:28:09.091 "abort": true, 00:28:09.091 "seek_hole": false, 00:28:09.092 "seek_data": false, 00:28:09.092 "copy": true, 00:28:09.092 "nvme_iov_md": false 00:28:09.092 }, 00:28:09.092 "memory_domains": [ 00:28:09.092 { 00:28:09.092 "dma_device_id": "system", 00:28:09.092 "dma_device_type": 1 00:28:09.092 }, 00:28:09.092 { 00:28:09.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.092 "dma_device_type": 2 00:28:09.092 } 00:28:09.092 ], 00:28:09.092 
"driver_specific": { 00:28:09.092 "passthru": { 00:28:09.092 "name": "pt1", 00:28:09.092 "base_bdev_name": "malloc1" 00:28:09.092 } 00:28:09.092 } 00:28:09.092 }' 00:28:09.092 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:09.350 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.608 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.608 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:09.608 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:09.608 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:09.608 13:46:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:09.867 13:46:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:09.867 "name": "pt2", 00:28:09.867 "aliases": [ 00:28:09.867 "00000000-0000-0000-0000-000000000002" 00:28:09.867 ], 00:28:09.867 "product_name": "passthru", 00:28:09.867 "block_size": 4128, 00:28:09.867 "num_blocks": 8192, 00:28:09.867 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:09.867 "md_size": 32, 00:28:09.867 "md_interleave": true, 00:28:09.867 "dif_type": 0, 00:28:09.867 "assigned_rate_limits": { 00:28:09.867 "rw_ios_per_sec": 0, 00:28:09.867 "rw_mbytes_per_sec": 0, 00:28:09.867 "r_mbytes_per_sec": 0, 00:28:09.867 "w_mbytes_per_sec": 0 00:28:09.867 }, 00:28:09.867 "claimed": true, 00:28:09.867 "claim_type": "exclusive_write", 00:28:09.867 "zoned": false, 00:28:09.867 "supported_io_types": { 00:28:09.867 "read": true, 00:28:09.867 "write": true, 00:28:09.867 "unmap": true, 00:28:09.867 "flush": true, 00:28:09.867 "reset": true, 00:28:09.867 "nvme_admin": false, 00:28:09.867 "nvme_io": false, 00:28:09.867 "nvme_io_md": false, 00:28:09.867 "write_zeroes": true, 00:28:09.867 "zcopy": true, 00:28:09.867 "get_zone_info": false, 00:28:09.867 "zone_management": false, 00:28:09.867 "zone_append": false, 00:28:09.867 "compare": false, 00:28:09.867 "compare_and_write": false, 00:28:09.867 "abort": true, 00:28:09.867 "seek_hole": false, 00:28:09.867 "seek_data": false, 00:28:09.867 "copy": true, 00:28:09.867 "nvme_iov_md": false 00:28:09.867 }, 00:28:09.867 "memory_domains": [ 00:28:09.867 { 00:28:09.867 "dma_device_id": "system", 00:28:09.867 "dma_device_type": 1 00:28:09.867 }, 00:28:09.867 { 00:28:09.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.867 "dma_device_type": 2 00:28:09.867 } 00:28:09.867 ], 00:28:09.867 "driver_specific": { 00:28:09.867 "passthru": { 00:28:09.867 "name": "pt2", 00:28:09.867 "base_bdev_name": "malloc2" 00:28:09.867 } 00:28:09.867 } 00:28:09.867 }' 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:09.867 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.868 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:10.126 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:10.384 [2024-07-15 13:46:49.639825] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:10.384 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' c757f804-2011-4e43-9d4c-2125b4bfebcc '!=' c757f804-2011-4e43-9d4c-2125b4bfebcc ']' 00:28:10.384 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:10.384 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:10.384 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:10.384 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:10.642 [2024-07-15 13:46:49.888258] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.642 13:46:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.901 13:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:10.901 "name": "raid_bdev1", 00:28:10.901 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:10.901 "strip_size_kb": 0, 00:28:10.901 "state": "online", 00:28:10.901 "raid_level": "raid1", 00:28:10.901 "superblock": true, 00:28:10.901 "num_base_bdevs": 2, 00:28:10.901 "num_base_bdevs_discovered": 1, 00:28:10.901 "num_base_bdevs_operational": 1, 00:28:10.901 "base_bdevs_list": [ 00:28:10.901 { 00:28:10.901 "name": null, 00:28:10.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.901 "is_configured": false, 00:28:10.901 "data_offset": 256, 00:28:10.901 "data_size": 7936 00:28:10.901 }, 00:28:10.901 { 00:28:10.901 "name": "pt2", 00:28:10.901 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:10.901 "is_configured": true, 00:28:10.901 "data_offset": 256, 00:28:10.901 "data_size": 7936 00:28:10.901 } 00:28:10.901 ] 00:28:10.901 }' 00:28:10.901 13:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:10.901 13:46:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:11.835 13:46:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:11.835 [2024-07-15 13:46:51.195706] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:11.835 [2024-07-15 13:46:51.195733] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:11.835 [2024-07-15 13:46:51.195789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:11.835 [2024-07-15 
13:46:51.195837] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:11.835 [2024-07-15 13:46:51.195848] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15e7c10 name raid_bdev1, state offline 00:28:11.835 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.835 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:12.093 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:12.093 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:12.093 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:12.093 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:12.093 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:28:12.351 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:12.611 [2024-07-15 13:46:51.929621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:12.611 [2024-07-15 13:46:51.929672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:12.611 [2024-07-15 13:46:51.929693] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176c9f0 00:28:12.611 [2024-07-15 13:46:51.929705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:12.611 [2024-07-15 13:46:51.931128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:12.611 [2024-07-15 13:46:51.931155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:12.611 [2024-07-15 13:46:51.931205] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:12.611 [2024-07-15 13:46:51.931231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:12.611 [2024-07-15 13:46:51.931304] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x176dea0 00:28:12.611 [2024-07-15 13:46:51.931314] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:12.611 [2024-07-15 13:46:51.931372] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x176bbc0 00:28:12.611 [2024-07-15 13:46:51.931445] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x176dea0 00:28:12.611 [2024-07-15 13:46:51.931454] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x176dea0 00:28:12.611 [2024-07-15 13:46:51.931511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:12.611 pt2 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.611 13:46:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.870 13:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.870 "name": "raid_bdev1", 00:28:12.870 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:12.870 "strip_size_kb": 0, 00:28:12.870 "state": "online", 00:28:12.870 "raid_level": "raid1", 00:28:12.870 "superblock": true, 00:28:12.870 "num_base_bdevs": 2, 00:28:12.870 "num_base_bdevs_discovered": 1, 00:28:12.870 "num_base_bdevs_operational": 1, 00:28:12.870 
"base_bdevs_list": [ 00:28:12.870 { 00:28:12.870 "name": null, 00:28:12.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.870 "is_configured": false, 00:28:12.870 "data_offset": 256, 00:28:12.870 "data_size": 7936 00:28:12.870 }, 00:28:12.870 { 00:28:12.870 "name": "pt2", 00:28:12.870 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:12.870 "is_configured": true, 00:28:12.870 "data_offset": 256, 00:28:12.870 "data_size": 7936 00:28:12.870 } 00:28:12.870 ] 00:28:12.870 }' 00:28:12.870 13:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.870 13:46:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:13.438 13:46:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:13.697 [2024-07-15 13:46:53.016500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:13.697 [2024-07-15 13:46:53.016528] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:13.697 [2024-07-15 13:46:53.016581] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:13.697 [2024-07-15 13:46:53.016626] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:13.697 [2024-07-15 13:46:53.016638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x176dea0 name raid_bdev1, state offline 00:28:13.697 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.697 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:13.954 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:13.954 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:13.954 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:13.954 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:14.213 [2024-07-15 13:46:53.513804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:14.213 [2024-07-15 13:46:53.513854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:14.213 [2024-07-15 13:46:53.513875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176c620 00:28:14.213 [2024-07-15 13:46:53.513888] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:14.213 [2024-07-15 13:46:53.515330] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:14.213 [2024-07-15 13:46:53.515357] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:14.213 [2024-07-15 13:46:53.515405] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:14.213 [2024-07-15 13:46:53.515432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:14.213 [2024-07-15 13:46:53.515515] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:14.213 [2024-07-15 13:46:53.515534] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.213 [2024-07-15 13:46:53.515550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x176e640 name raid_bdev1, state configuring 00:28:14.213 [2024-07-15 13:46:53.515573] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:14.213 [2024-07-15 13:46:53.515627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x176e640 00:28:14.213 [2024-07-15 13:46:53.515637] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:14.213 [2024-07-15 13:46:53.515692] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x176d810 00:28:14.213 [2024-07-15 13:46:53.515764] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x176e640 00:28:14.213 [2024-07-15 13:46:53.515773] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x176e640 00:28:14.213 [2024-07-15 13:46:53.515831] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:14.213 pt1 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.213 13:46:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.213 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.472 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.472 "name": "raid_bdev1", 00:28:14.472 "uuid": "c757f804-2011-4e43-9d4c-2125b4bfebcc", 00:28:14.472 "strip_size_kb": 0, 00:28:14.472 "state": "online", 00:28:14.472 "raid_level": "raid1", 00:28:14.472 "superblock": true, 00:28:14.472 "num_base_bdevs": 2, 00:28:14.472 "num_base_bdevs_discovered": 1, 00:28:14.472 "num_base_bdevs_operational": 1, 00:28:14.472 "base_bdevs_list": [ 00:28:14.472 { 00:28:14.472 "name": null, 00:28:14.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.472 "is_configured": false, 00:28:14.472 "data_offset": 256, 00:28:14.472 "data_size": 7936 00:28:14.472 }, 00:28:14.472 { 00:28:14.472 "name": "pt2", 00:28:14.472 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:14.472 "is_configured": true, 00:28:14.472 "data_offset": 256, 00:28:14.472 "data_size": 7936 00:28:14.472 } 00:28:14.472 ] 00:28:14.472 }' 00:28:14.472 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.472 13:46:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:15.038 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:15.038 
13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:15.296 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:15.296 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:15.296 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:15.555 [2024-07-15 13:46:54.833522] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' c757f804-2011-4e43-9d4c-2125b4bfebcc '!=' c757f804-2011-4e43-9d4c-2125b4bfebcc ']' 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2225116 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2225116 ']' 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2225116 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:15.555 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2225116 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 2225116' 00:28:15.556 killing process with pid 2225116 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2225116 00:28:15.556 [2024-07-15 13:46:54.905858] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:15.556 [2024-07-15 13:46:54.905916] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:15.556 [2024-07-15 13:46:54.905969] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:15.556 [2024-07-15 13:46:54.905982] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x176e640 name raid_bdev1, state offline 00:28:15.556 13:46:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2225116 00:28:15.556 [2024-07-15 13:46:54.924331] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:15.815 13:46:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:28:15.815 00:28:15.815 real 0m15.863s 00:28:15.815 user 0m28.825s 00:28:15.815 sys 0m2.871s 00:28:15.815 13:46:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:15.815 13:46:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:15.815 ************************************ 00:28:15.815 END TEST raid_superblock_test_md_interleaved 00:28:15.815 ************************************ 00:28:15.815 13:46:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:15.815 13:46:55 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:28:15.815 13:46:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:15.815 13:46:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:15.815 13:46:55 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:28:15.815 ************************************ 00:28:15.815 START TEST raid_rebuild_test_sb_md_interleaved 00:28:15.815 ************************************ 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:15.815 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2227397 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2227397 /var/tmp/spdk-raid.sock 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2227397 ']' 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:16.097 
13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:16.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:16.097 13:46:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:16.097 [2024-07-15 13:46:55.299787] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:28:16.097 [2024-07-15 13:46:55.299851] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2227397 ] 00:28:16.097 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:16.097 Zero copy mechanism will not be used. 
00:28:16.097 [2024-07-15 13:46:55.428306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:16.356 [2024-07-15 13:46:55.536502] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:16.356 [2024-07-15 13:46:55.599357] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:16.356 [2024-07-15 13:46:55.599388] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:16.920 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:16.920 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:16.920 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:16.920 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:28:17.179 BaseBdev1_malloc 00:28:17.179 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:17.437 [2024-07-15 13:46:56.732464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:17.437 [2024-07-15 13:46:56.732511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:17.437 [2024-07-15 13:46:56.732535] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1abbce0 00:28:17.437 [2024-07-15 13:46:56.732548] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:17.437 [2024-07-15 13:46:56.734047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:17.437 [2024-07-15 13:46:56.734073] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:17.437 BaseBdev1 00:28:17.437 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:17.437 13:46:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:28:17.715 BaseBdev2_malloc 00:28:17.715 13:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:17.972 [2024-07-15 13:46:57.251021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:17.972 [2024-07-15 13:46:57.251067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:17.972 [2024-07-15 13:46:57.251091] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab32d0 00:28:17.972 [2024-07-15 13:46:57.251103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:17.972 [2024-07-15 13:46:57.252875] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:17.972 [2024-07-15 13:46:57.252903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:17.972 BaseBdev2 00:28:17.972 13:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:28:18.229 spare_malloc 00:28:18.229 13:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:28:18.487 spare_delay 00:28:18.487 13:46:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:18.765 [2024-07-15 13:46:57.987080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:18.765 [2024-07-15 13:46:57.987125] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.765 [2024-07-15 13:46:57.987148] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab6070 00:28:18.765 [2024-07-15 13:46:57.987160] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.765 [2024-07-15 13:46:57.988591] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.765 [2024-07-15 13:46:57.988619] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:18.765 spare 00:28:18.765 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:19.023 [2024-07-15 13:46:58.227745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:19.023 [2024-07-15 13:46:58.229059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:19.023 [2024-07-15 13:46:58.229229] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ab8370 00:28:19.023 [2024-07-15 13:46:58.229243] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:19.023 [2024-07-15 13:46:58.229314] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x191e9c0 00:28:19.023 [2024-07-15 13:46:58.229399] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1ab8370 00:28:19.023 [2024-07-15 13:46:58.229409] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ab8370 00:28:19.023 [2024-07-15 13:46:58.229465] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.023 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.280 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:28:19.280 "name": "raid_bdev1", 00:28:19.280 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:19.280 "strip_size_kb": 0, 00:28:19.280 "state": "online", 00:28:19.280 "raid_level": "raid1", 00:28:19.280 "superblock": true, 00:28:19.280 "num_base_bdevs": 2, 00:28:19.280 "num_base_bdevs_discovered": 2, 00:28:19.280 "num_base_bdevs_operational": 2, 00:28:19.280 "base_bdevs_list": [ 00:28:19.280 { 00:28:19.280 "name": "BaseBdev1", 00:28:19.280 "uuid": "3584c2f4-df45-50e7-9aa4-836452bd48c9", 00:28:19.280 "is_configured": true, 00:28:19.280 "data_offset": 256, 00:28:19.280 "data_size": 7936 00:28:19.280 }, 00:28:19.280 { 00:28:19.280 "name": "BaseBdev2", 00:28:19.280 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:19.280 "is_configured": true, 00:28:19.280 "data_offset": 256, 00:28:19.280 "data_size": 7936 00:28:19.280 } 00:28:19.280 ] 00:28:19.280 }' 00:28:19.280 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.280 13:46:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:19.844 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:19.844 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:20.102 [2024-07-15 13:46:59.330892] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:20.102 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:20.102 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.102 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:28:20.359 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:20.359 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:20.359 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:28:20.359 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:20.617 [2024-07-15 13:46:59.823951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.617 
13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.617 13:46:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.874 13:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.874 "name": "raid_bdev1", 00:28:20.874 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:20.874 "strip_size_kb": 0, 00:28:20.874 "state": "online", 00:28:20.874 "raid_level": "raid1", 00:28:20.874 "superblock": true, 00:28:20.874 "num_base_bdevs": 2, 00:28:20.874 "num_base_bdevs_discovered": 1, 00:28:20.874 "num_base_bdevs_operational": 1, 00:28:20.874 "base_bdevs_list": [ 00:28:20.874 { 00:28:20.874 "name": null, 00:28:20.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.874 "is_configured": false, 00:28:20.874 "data_offset": 256, 00:28:20.874 "data_size": 7936 00:28:20.874 }, 00:28:20.874 { 00:28:20.874 "name": "BaseBdev2", 00:28:20.874 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:20.874 "is_configured": true, 00:28:20.874 "data_offset": 256, 00:28:20.874 "data_size": 7936 00:28:20.874 } 00:28:20.874 ] 00:28:20.874 }' 00:28:20.874 13:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.874 13:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:21.492 13:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:21.751 [2024-07-15 13:47:00.914889] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:21.752 [2024-07-15 13:47:00.918914] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1ab8250 00:28:21.752 [2024-07-15 13:47:00.920982] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:21.752 13:47:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.685 13:47:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:22.943 "name": "raid_bdev1", 00:28:22.943 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:22.943 "strip_size_kb": 0, 00:28:22.943 "state": "online", 00:28:22.943 "raid_level": "raid1", 00:28:22.943 "superblock": true, 00:28:22.943 "num_base_bdevs": 2, 00:28:22.943 "num_base_bdevs_discovered": 2, 00:28:22.943 "num_base_bdevs_operational": 2, 00:28:22.943 "process": { 00:28:22.943 "type": "rebuild", 00:28:22.943 "target": "spare", 00:28:22.943 "progress": { 00:28:22.943 "blocks": 3072, 00:28:22.943 "percent": 38 00:28:22.943 } 00:28:22.943 }, 00:28:22.943 "base_bdevs_list": [ 00:28:22.943 { 
00:28:22.943 "name": "spare", 00:28:22.943 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:22.943 "is_configured": true, 00:28:22.943 "data_offset": 256, 00:28:22.943 "data_size": 7936 00:28:22.943 }, 00:28:22.943 { 00:28:22.943 "name": "BaseBdev2", 00:28:22.943 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:22.943 "is_configured": true, 00:28:22.943 "data_offset": 256, 00:28:22.943 "data_size": 7936 00:28:22.943 } 00:28:22.943 ] 00:28:22.943 }' 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:22.943 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:23.201 [2024-07-15 13:47:02.504257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:23.201 [2024-07-15 13:47:02.533682] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:23.201 [2024-07-15 13:47:02.533730] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:23.201 [2024-07-15 13:47:02.533746] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:23.201 [2024-07-15 13:47:02.533755] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:23.201 13:47:02 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.201 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.458 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.458 "name": "raid_bdev1", 00:28:23.458 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:23.458 "strip_size_kb": 0, 00:28:23.458 "state": "online", 00:28:23.458 "raid_level": "raid1", 00:28:23.458 "superblock": true, 00:28:23.458 "num_base_bdevs": 2, 00:28:23.458 "num_base_bdevs_discovered": 1, 00:28:23.458 "num_base_bdevs_operational": 1, 00:28:23.458 "base_bdevs_list": [ 00:28:23.458 { 00:28:23.458 "name": null, 00:28:23.458 
"uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.458 "is_configured": false, 00:28:23.458 "data_offset": 256, 00:28:23.458 "data_size": 7936 00:28:23.458 }, 00:28:23.458 { 00:28:23.458 "name": "BaseBdev2", 00:28:23.459 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:23.459 "is_configured": true, 00:28:23.459 "data_offset": 256, 00:28:23.459 "data_size": 7936 00:28:23.459 } 00:28:23.459 ] 00:28:23.459 }' 00:28:23.459 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.459 13:47:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.025 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.284 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:24.284 "name": "raid_bdev1", 00:28:24.284 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:24.284 "strip_size_kb": 0, 00:28:24.284 "state": "online", 00:28:24.284 "raid_level": "raid1", 00:28:24.284 "superblock": true, 00:28:24.284 
"num_base_bdevs": 2, 00:28:24.284 "num_base_bdevs_discovered": 1, 00:28:24.284 "num_base_bdevs_operational": 1, 00:28:24.284 "base_bdevs_list": [ 00:28:24.284 { 00:28:24.284 "name": null, 00:28:24.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.284 "is_configured": false, 00:28:24.284 "data_offset": 256, 00:28:24.284 "data_size": 7936 00:28:24.284 }, 00:28:24.284 { 00:28:24.284 "name": "BaseBdev2", 00:28:24.284 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:24.284 "is_configured": true, 00:28:24.284 "data_offset": 256, 00:28:24.284 "data_size": 7936 00:28:24.284 } 00:28:24.284 ] 00:28:24.284 }' 00:28:24.284 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:24.284 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:24.284 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:24.542 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:24.542 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:24.799 [2024-07-15 13:47:03.969315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:24.799 [2024-07-15 13:47:03.972924] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab4270 00:28:24.799 [2024-07-15 13:47:03.974383] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:24.799 13:47:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.731 13:47:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.988 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:25.988 "name": "raid_bdev1", 00:28:25.988 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:25.988 "strip_size_kb": 0, 00:28:25.988 "state": "online", 00:28:25.988 "raid_level": "raid1", 00:28:25.988 "superblock": true, 00:28:25.988 "num_base_bdevs": 2, 00:28:25.988 "num_base_bdevs_discovered": 2, 00:28:25.988 "num_base_bdevs_operational": 2, 00:28:25.988 "process": { 00:28:25.988 "type": "rebuild", 00:28:25.988 "target": "spare", 00:28:25.989 "progress": { 00:28:25.989 "blocks": 3072, 00:28:25.989 "percent": 38 00:28:25.989 } 00:28:25.989 }, 00:28:25.989 "base_bdevs_list": [ 00:28:25.989 { 00:28:25.989 "name": "spare", 00:28:25.989 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:25.989 "is_configured": true, 00:28:25.989 "data_offset": 256, 00:28:25.989 "data_size": 7936 00:28:25.989 }, 00:28:25.989 { 00:28:25.989 "name": "BaseBdev2", 00:28:25.989 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:25.989 "is_configured": true, 00:28:25.989 "data_offset": 256, 00:28:25.989 "data_size": 7936 00:28:25.989 
} 00:28:25.989 ] 00:28:25.989 }' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:25.989 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1129 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.989 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.247 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.247 "name": "raid_bdev1", 00:28:26.247 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:26.247 "strip_size_kb": 0, 00:28:26.247 "state": "online", 00:28:26.247 "raid_level": "raid1", 00:28:26.247 "superblock": true, 00:28:26.247 "num_base_bdevs": 2, 00:28:26.247 "num_base_bdevs_discovered": 2, 00:28:26.247 "num_base_bdevs_operational": 2, 00:28:26.247 "process": { 00:28:26.247 "type": "rebuild", 00:28:26.247 "target": "spare", 00:28:26.247 "progress": { 00:28:26.247 "blocks": 3840, 00:28:26.247 "percent": 48 00:28:26.247 } 00:28:26.247 }, 00:28:26.247 "base_bdevs_list": [ 00:28:26.247 { 00:28:26.247 "name": "spare", 00:28:26.247 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:26.247 "is_configured": true, 00:28:26.247 "data_offset": 256, 00:28:26.247 "data_size": 7936 00:28:26.247 }, 00:28:26.247 { 00:28:26.247 "name": "BaseBdev2", 00:28:26.247 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:26.247 "is_configured": true, 00:28:26.247 "data_offset": 256, 00:28:26.247 "data_size": 7936 00:28:26.247 } 00:28:26.247 ] 00:28:26.247 }' 00:28:26.247 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.247 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:26.247 13:47:05 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.247 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:26.247 13:47:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:27.623 "name": "raid_bdev1", 00:28:27.623 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:27.623 "strip_size_kb": 0, 00:28:27.623 "state": "online", 00:28:27.623 "raid_level": "raid1", 00:28:27.623 "superblock": true, 00:28:27.623 "num_base_bdevs": 2, 00:28:27.623 "num_base_bdevs_discovered": 2, 00:28:27.623 "num_base_bdevs_operational": 2, 00:28:27.623 "process": { 00:28:27.623 "type": "rebuild", 00:28:27.623 
"target": "spare", 00:28:27.623 "progress": { 00:28:27.623 "blocks": 7168, 00:28:27.623 "percent": 90 00:28:27.623 } 00:28:27.623 }, 00:28:27.623 "base_bdevs_list": [ 00:28:27.623 { 00:28:27.623 "name": "spare", 00:28:27.623 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:27.623 "is_configured": true, 00:28:27.623 "data_offset": 256, 00:28:27.623 "data_size": 7936 00:28:27.623 }, 00:28:27.623 { 00:28:27.623 "name": "BaseBdev2", 00:28:27.623 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:27.623 "is_configured": true, 00:28:27.623 "data_offset": 256, 00:28:27.623 "data_size": 7936 00:28:27.623 } 00:28:27.623 ] 00:28:27.623 }' 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:27.623 13:47:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:27.623 13:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:27.623 13:47:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:27.881 [2024-07-15 13:47:07.098577] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:27.881 [2024-07-15 13:47:07.098636] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:27.881 [2024-07-15 13:47:07.098722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.815 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.074 "name": "raid_bdev1", 00:28:29.074 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:29.074 "strip_size_kb": 0, 00:28:29.074 "state": "online", 00:28:29.074 "raid_level": "raid1", 00:28:29.074 "superblock": true, 00:28:29.074 "num_base_bdevs": 2, 00:28:29.074 "num_base_bdevs_discovered": 2, 00:28:29.074 "num_base_bdevs_operational": 2, 00:28:29.074 "base_bdevs_list": [ 00:28:29.074 { 00:28:29.074 "name": "spare", 00:28:29.074 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:29.074 "is_configured": true, 00:28:29.074 "data_offset": 256, 00:28:29.074 "data_size": 7936 00:28:29.074 }, 00:28:29.074 { 00:28:29.074 "name": "BaseBdev2", 00:28:29.074 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:29.074 "is_configured": true, 00:28:29.074 "data_offset": 256, 00:28:29.074 "data_size": 7936 00:28:29.074 } 00:28:29.074 ] 00:28:29.074 }' 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.074 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.333 "name": "raid_bdev1", 00:28:29.333 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:29.333 "strip_size_kb": 0, 00:28:29.333 "state": "online", 00:28:29.333 "raid_level": "raid1", 00:28:29.333 "superblock": true, 00:28:29.333 "num_base_bdevs": 2, 00:28:29.333 "num_base_bdevs_discovered": 2, 00:28:29.333 "num_base_bdevs_operational": 2, 00:28:29.333 "base_bdevs_list": [ 00:28:29.333 { 00:28:29.333 "name": "spare", 00:28:29.333 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:29.333 
"is_configured": true, 00:28:29.333 "data_offset": 256, 00:28:29.333 "data_size": 7936 00:28:29.333 }, 00:28:29.333 { 00:28:29.333 "name": "BaseBdev2", 00:28:29.333 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:29.333 "is_configured": true, 00:28:29.333 "data_offset": 256, 00:28:29.333 "data_size": 7936 00:28:29.333 } 00:28:29.333 ] 00:28:29.333 }' 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.333 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.592 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:29.592 "name": "raid_bdev1", 00:28:29.592 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:29.592 "strip_size_kb": 0, 00:28:29.592 "state": "online", 00:28:29.592 "raid_level": "raid1", 00:28:29.592 "superblock": true, 00:28:29.592 "num_base_bdevs": 2, 00:28:29.592 "num_base_bdevs_discovered": 2, 00:28:29.592 "num_base_bdevs_operational": 2, 00:28:29.592 "base_bdevs_list": [ 00:28:29.592 { 00:28:29.592 "name": "spare", 00:28:29.592 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:29.592 "is_configured": true, 00:28:29.592 "data_offset": 256, 00:28:29.592 "data_size": 7936 00:28:29.592 }, 00:28:29.592 { 00:28:29.592 "name": "BaseBdev2", 00:28:29.592 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:29.592 "is_configured": true, 00:28:29.592 "data_offset": 256, 00:28:29.592 "data_size": 7936 00:28:29.592 } 00:28:29.592 ] 00:28:29.592 }' 00:28:29.592 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:29.592 13:47:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:30.182 13:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:30.442 [2024-07-15 13:47:09.738187] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:28:30.442 [2024-07-15 13:47:09.738214] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:30.442 [2024-07-15 13:47:09.738271] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:30.442 [2024-07-15 13:47:09.738327] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:30.442 [2024-07-15 13:47:09.738338] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ab8370 name raid_bdev1, state offline 00:28:30.442 13:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.442 13:47:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:28:30.700 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:30.700 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:28:30.700 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:30.700 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:30.959 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:31.219 [2024-07-15 13:47:10.476108] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:31.219 [2024-07-15 13:47:10.476157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:31.219 [2024-07-15 13:47:10.476179] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab67e0 00:28:31.219 [2024-07-15 13:47:10.476191] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:31.219 [2024-07-15 13:47:10.477694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:31.219 [2024-07-15 13:47:10.477721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:31.219 [2024-07-15 13:47:10.477781] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:31.219 [2024-07-15 13:47:10.477807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:31.219 [2024-07-15 13:47:10.477896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:31.219 spare 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.219 13:47:10 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.219 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.219 [2024-07-15 13:47:10.578212] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ab7270 00:28:31.219 [2024-07-15 13:47:10.578231] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:31.219 [2024-07-15 13:47:10.578311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x191e9c0 00:28:31.219 [2024-07-15 13:47:10.578408] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ab7270 00:28:31.219 [2024-07-15 13:47:10.578418] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ab7270 00:28:31.219 [2024-07-15 13:47:10.578488] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:31.477 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.477 "name": "raid_bdev1", 00:28:31.477 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:31.477 "strip_size_kb": 0, 00:28:31.477 "state": "online", 00:28:31.477 "raid_level": "raid1", 00:28:31.477 "superblock": true, 00:28:31.477 "num_base_bdevs": 2, 00:28:31.477 "num_base_bdevs_discovered": 2, 00:28:31.477 "num_base_bdevs_operational": 2, 00:28:31.477 "base_bdevs_list": [ 00:28:31.477 { 00:28:31.477 "name": "spare", 00:28:31.477 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:31.477 "is_configured": true, 00:28:31.477 "data_offset": 256, 00:28:31.477 "data_size": 7936 00:28:31.477 }, 00:28:31.477 { 00:28:31.477 "name": "BaseBdev2", 00:28:31.477 "uuid": 
"acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:31.477 "is_configured": true, 00:28:31.477 "data_offset": 256, 00:28:31.477 "data_size": 7936 00:28:31.477 } 00:28:31.477 ] 00:28:31.477 }' 00:28:31.477 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.477 13:47:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.062 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:32.321 "name": "raid_bdev1", 00:28:32.321 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:32.321 "strip_size_kb": 0, 00:28:32.321 "state": "online", 00:28:32.321 "raid_level": "raid1", 00:28:32.321 "superblock": true, 00:28:32.321 "num_base_bdevs": 2, 00:28:32.321 "num_base_bdevs_discovered": 2, 00:28:32.321 "num_base_bdevs_operational": 2, 00:28:32.321 "base_bdevs_list": [ 00:28:32.321 { 00:28:32.321 "name": "spare", 00:28:32.321 "uuid": 
"b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:32.321 "is_configured": true, 00:28:32.321 "data_offset": 256, 00:28:32.321 "data_size": 7936 00:28:32.321 }, 00:28:32.321 { 00:28:32.321 "name": "BaseBdev2", 00:28:32.321 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:32.321 "is_configured": true, 00:28:32.321 "data_offset": 256, 00:28:32.321 "data_size": 7936 00:28:32.321 } 00:28:32.321 ] 00:28:32.321 }' 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.321 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:32.579 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:32.579 13:47:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:32.837 [2024-07-15 13:47:12.180740] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.837 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.095 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.095 "name": "raid_bdev1", 00:28:33.095 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:33.095 "strip_size_kb": 0, 00:28:33.095 "state": "online", 00:28:33.095 "raid_level": "raid1", 00:28:33.095 "superblock": true, 00:28:33.095 "num_base_bdevs": 2, 00:28:33.095 "num_base_bdevs_discovered": 1, 00:28:33.095 "num_base_bdevs_operational": 1, 00:28:33.095 "base_bdevs_list": [ 00:28:33.095 { 00:28:33.095 "name": null, 00:28:33.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:33.095 "is_configured": false, 00:28:33.095 "data_offset": 
256, 00:28:33.095 "data_size": 7936 00:28:33.095 }, 00:28:33.095 { 00:28:33.095 "name": "BaseBdev2", 00:28:33.095 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:33.095 "is_configured": true, 00:28:33.095 "data_offset": 256, 00:28:33.095 "data_size": 7936 00:28:33.095 } 00:28:33.095 ] 00:28:33.095 }' 00:28:33.095 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.095 13:47:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:33.664 13:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:33.974 [2024-07-15 13:47:13.247579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:33.974 [2024-07-15 13:47:13.247726] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:33.974 [2024-07-15 13:47:13.247744] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:33.974 [2024-07-15 13:47:13.247770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:33.974 [2024-07-15 13:47:13.251251] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab9670 00:28:33.974 [2024-07-15 13:47:13.252662] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:33.974 13:47:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.912 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.170 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:35.170 "name": "raid_bdev1", 00:28:35.170 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:35.170 "strip_size_kb": 0, 00:28:35.170 "state": "online", 00:28:35.170 "raid_level": "raid1", 00:28:35.170 "superblock": true, 00:28:35.170 "num_base_bdevs": 2, 00:28:35.170 "num_base_bdevs_discovered": 2, 00:28:35.170 "num_base_bdevs_operational": 2, 00:28:35.170 "process": { 00:28:35.170 "type": 
"rebuild", 00:28:35.170 "target": "spare", 00:28:35.170 "progress": { 00:28:35.170 "blocks": 3072, 00:28:35.170 "percent": 38 00:28:35.170 } 00:28:35.170 }, 00:28:35.170 "base_bdevs_list": [ 00:28:35.170 { 00:28:35.170 "name": "spare", 00:28:35.170 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:35.170 "is_configured": true, 00:28:35.170 "data_offset": 256, 00:28:35.170 "data_size": 7936 00:28:35.170 }, 00:28:35.170 { 00:28:35.170 "name": "BaseBdev2", 00:28:35.170 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:35.170 "is_configured": true, 00:28:35.170 "data_offset": 256, 00:28:35.170 "data_size": 7936 00:28:35.170 } 00:28:35.170 ] 00:28:35.170 }' 00:28:35.170 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:35.170 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:35.170 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:35.170 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:35.171 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:35.429 [2024-07-15 13:47:14.753906] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:35.429 [2024-07-15 13:47:14.764167] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:35.429 [2024-07-15 13:47:14.764208] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:35.429 [2024-07-15 13:47:14.764224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:35.429 [2024-07-15 13:47:14.764232] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.429 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.688 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.688 "name": "raid_bdev1", 00:28:35.688 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:35.688 "strip_size_kb": 0, 00:28:35.688 "state": "online", 00:28:35.688 "raid_level": "raid1", 00:28:35.688 "superblock": true, 00:28:35.688 
"num_base_bdevs": 2, 00:28:35.688 "num_base_bdevs_discovered": 1, 00:28:35.688 "num_base_bdevs_operational": 1, 00:28:35.688 "base_bdevs_list": [ 00:28:35.688 { 00:28:35.688 "name": null, 00:28:35.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.688 "is_configured": false, 00:28:35.688 "data_offset": 256, 00:28:35.688 "data_size": 7936 00:28:35.688 }, 00:28:35.688 { 00:28:35.688 "name": "BaseBdev2", 00:28:35.688 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:35.688 "is_configured": true, 00:28:35.688 "data_offset": 256, 00:28:35.688 "data_size": 7936 00:28:35.688 } 00:28:35.688 ] 00:28:35.688 }' 00:28:35.688 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.688 13:47:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:36.295 13:47:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:36.576 [2024-07-15 13:47:15.798316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:36.576 [2024-07-15 13:47:15.798362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:36.576 [2024-07-15 13:47:15.798385] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab7c80 00:28:36.576 [2024-07-15 13:47:15.798398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:36.576 [2024-07-15 13:47:15.798582] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:36.576 [2024-07-15 13:47:15.798598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:36.576 [2024-07-15 13:47:15.798652] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:36.576 [2024-07-15 13:47:15.798664] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:36.576 [2024-07-15 13:47:15.798675] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:36.576 [2024-07-15 13:47:15.798693] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:36.576 [2024-07-15 13:47:15.802163] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab9630 00:28:36.576 [2024-07-15 13:47:15.803488] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:36.576 spare 00:28:36.576 13:47:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.512 13:47:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.770 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:37.770 "name": "raid_bdev1", 00:28:37.770 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 
00:28:37.770 "strip_size_kb": 0, 00:28:37.770 "state": "online", 00:28:37.770 "raid_level": "raid1", 00:28:37.770 "superblock": true, 00:28:37.770 "num_base_bdevs": 2, 00:28:37.770 "num_base_bdevs_discovered": 2, 00:28:37.770 "num_base_bdevs_operational": 2, 00:28:37.770 "process": { 00:28:37.770 "type": "rebuild", 00:28:37.770 "target": "spare", 00:28:37.770 "progress": { 00:28:37.770 "blocks": 3072, 00:28:37.770 "percent": 38 00:28:37.770 } 00:28:37.770 }, 00:28:37.770 "base_bdevs_list": [ 00:28:37.770 { 00:28:37.770 "name": "spare", 00:28:37.770 "uuid": "b3d9083c-458c-507e-98fd-8840f02dd307", 00:28:37.770 "is_configured": true, 00:28:37.770 "data_offset": 256, 00:28:37.770 "data_size": 7936 00:28:37.770 }, 00:28:37.770 { 00:28:37.770 "name": "BaseBdev2", 00:28:37.770 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:37.770 "is_configured": true, 00:28:37.770 "data_offset": 256, 00:28:37.770 "data_size": 7936 00:28:37.770 } 00:28:37.770 ] 00:28:37.770 }' 00:28:37.770 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:37.771 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:37.771 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.771 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:37.771 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:38.029 [2024-07-15 13:47:17.392802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:38.029 [2024-07-15 13:47:17.416135] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:38.029 [2024-07-15 
13:47:17.416189] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:38.029 [2024-07-15 13:47:17.416205] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:38.029 [2024-07-15 13:47:17.416214] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.029 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.287 13:47:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.288 "name": "raid_bdev1", 00:28:38.288 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:38.288 "strip_size_kb": 0, 00:28:38.288 "state": "online", 00:28:38.288 "raid_level": "raid1", 00:28:38.288 "superblock": true, 00:28:38.288 "num_base_bdevs": 2, 00:28:38.288 "num_base_bdevs_discovered": 1, 00:28:38.288 "num_base_bdevs_operational": 1, 00:28:38.288 "base_bdevs_list": [ 00:28:38.288 { 00:28:38.288 "name": null, 00:28:38.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.288 "is_configured": false, 00:28:38.288 "data_offset": 256, 00:28:38.288 "data_size": 7936 00:28:38.288 }, 00:28:38.288 { 00:28:38.288 "name": "BaseBdev2", 00:28:38.288 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:38.288 "is_configured": true, 00:28:38.288 "data_offset": 256, 00:28:38.288 "data_size": 7936 00:28:38.288 } 00:28:38.288 ] 00:28:38.288 }' 00:28:38.288 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.288 13:47:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:38.855 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.855 13:47:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:39.114 "name": "raid_bdev1", 00:28:39.114 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:39.114 "strip_size_kb": 0, 00:28:39.114 "state": "online", 00:28:39.114 "raid_level": "raid1", 00:28:39.114 "superblock": true, 00:28:39.114 "num_base_bdevs": 2, 00:28:39.114 "num_base_bdevs_discovered": 1, 00:28:39.114 "num_base_bdevs_operational": 1, 00:28:39.114 "base_bdevs_list": [ 00:28:39.114 { 00:28:39.114 "name": null, 00:28:39.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:39.114 "is_configured": false, 00:28:39.114 "data_offset": 256, 00:28:39.114 "data_size": 7936 00:28:39.114 }, 00:28:39.114 { 00:28:39.114 "name": "BaseBdev2", 00:28:39.114 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:39.114 "is_configured": true, 00:28:39.114 "data_offset": 256, 00:28:39.114 "data_size": 7936 00:28:39.114 } 00:28:39.114 ] 00:28:39.114 }' 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:39.114 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:39.372 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:39.631 [2024-07-15 13:47:18.976123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:39.631 [2024-07-15 13:47:18.976168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:39.631 [2024-07-15 13:47:18.976189] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191f0f0 00:28:39.631 [2024-07-15 13:47:18.976202] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:39.631 [2024-07-15 13:47:18.976365] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:39.631 [2024-07-15 13:47:18.976381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:39.631 [2024-07-15 13:47:18.976426] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:39.631 [2024-07-15 13:47:18.976438] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:39.631 [2024-07-15 13:47:18.976449] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:39.631 BaseBdev1 00:28:39.631 13:47:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:41.005 13:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:41.005 13:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:41.005 13:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:41.005 13:47:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.005 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.005 "name": "raid_bdev1", 00:28:41.005 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:41.005 "strip_size_kb": 0, 00:28:41.005 "state": "online", 00:28:41.005 "raid_level": "raid1", 00:28:41.005 "superblock": true, 00:28:41.005 "num_base_bdevs": 2, 00:28:41.005 "num_base_bdevs_discovered": 1, 00:28:41.005 "num_base_bdevs_operational": 1, 00:28:41.005 "base_bdevs_list": [ 00:28:41.005 { 00:28:41.005 "name": null, 00:28:41.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.005 "is_configured": false, 00:28:41.005 "data_offset": 256, 00:28:41.005 "data_size": 7936 00:28:41.005 }, 00:28:41.005 { 00:28:41.005 "name": "BaseBdev2", 00:28:41.006 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:41.006 "is_configured": true, 00:28:41.006 "data_offset": 256, 00:28:41.006 
"data_size": 7936 00:28:41.006 } 00:28:41.006 ] 00:28:41.006 }' 00:28:41.006 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.006 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.573 13:47:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:41.831 "name": "raid_bdev1", 00:28:41.831 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:41.831 "strip_size_kb": 0, 00:28:41.831 "state": "online", 00:28:41.831 "raid_level": "raid1", 00:28:41.831 "superblock": true, 00:28:41.831 "num_base_bdevs": 2, 00:28:41.831 "num_base_bdevs_discovered": 1, 00:28:41.831 "num_base_bdevs_operational": 1, 00:28:41.831 "base_bdevs_list": [ 00:28:41.831 { 00:28:41.831 "name": null, 00:28:41.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.831 "is_configured": false, 00:28:41.831 "data_offset": 256, 00:28:41.831 "data_size": 7936 00:28:41.831 }, 
00:28:41.831 { 00:28:41.831 "name": "BaseBdev2", 00:28:41.831 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:41.831 "is_configured": true, 00:28:41.831 "data_offset": 256, 00:28:41.831 "data_size": 7936 00:28:41.831 } 00:28:41.831 ] 00:28:41.831 }' 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:41.831 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:42.089 [2024-07-15 13:47:21.394566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:42.089 [2024-07-15 13:47:21.394692] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:42.089 [2024-07-15 13:47:21.394708] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:42.089 request: 00:28:42.089 { 00:28:42.089 "base_bdev": "BaseBdev1", 00:28:42.089 "raid_bdev": "raid_bdev1", 00:28:42.089 "method": "bdev_raid_add_base_bdev", 00:28:42.089 "req_id": 1 00:28:42.089 } 00:28:42.089 Got JSON-RPC error response 00:28:42.090 response: 00:28:42.090 { 00:28:42.090 "code": -22, 00:28:42.090 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:42.090 } 00:28:42.090 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:42.090 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:28:42.090 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:42.090 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:42.090 13:47:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.022 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.280 13:47:22 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.280 "name": "raid_bdev1", 00:28:43.280 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:43.280 "strip_size_kb": 0, 00:28:43.280 "state": "online", 00:28:43.280 "raid_level": "raid1", 00:28:43.280 "superblock": true, 00:28:43.280 "num_base_bdevs": 2, 00:28:43.280 "num_base_bdevs_discovered": 1, 00:28:43.280 "num_base_bdevs_operational": 1, 00:28:43.280 "base_bdevs_list": [ 00:28:43.280 { 00:28:43.280 "name": null, 00:28:43.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:43.280 "is_configured": false, 00:28:43.280 "data_offset": 256, 00:28:43.280 "data_size": 7936 00:28:43.280 }, 00:28:43.280 { 00:28:43.280 "name": "BaseBdev2", 00:28:43.280 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:43.280 "is_configured": true, 00:28:43.280 "data_offset": 256, 00:28:43.280 "data_size": 7936 00:28:43.280 } 00:28:43.280 ] 00:28:43.280 }' 00:28:43.280 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.280 13:47:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:43.845 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:43.845 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:43.845 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:43.845 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:43.845 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:44.103 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.103 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.103 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:44.103 "name": "raid_bdev1", 00:28:44.103 "uuid": "1c0df4f3-5ae5-448f-a70d-192287711385", 00:28:44.103 "strip_size_kb": 0, 00:28:44.103 "state": "online", 00:28:44.103 "raid_level": "raid1", 00:28:44.103 "superblock": true, 00:28:44.103 "num_base_bdevs": 2, 00:28:44.103 "num_base_bdevs_discovered": 1, 00:28:44.103 "num_base_bdevs_operational": 1, 00:28:44.103 "base_bdevs_list": [ 00:28:44.103 { 00:28:44.103 "name": null, 00:28:44.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:44.103 "is_configured": false, 00:28:44.103 "data_offset": 256, 00:28:44.103 "data_size": 7936 00:28:44.103 }, 00:28:44.103 { 00:28:44.103 "name": "BaseBdev2", 00:28:44.103 "uuid": "acbb5b53-9f3e-5114-a494-022bc1afe077", 00:28:44.103 "is_configured": true, 00:28:44.103 "data_offset": 256, 00:28:44.103 "data_size": 7936 00:28:44.103 } 00:28:44.103 ] 00:28:44.103 }' 00:28:44.103 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2227397 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2227397 ']' 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 2227397 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2227397 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2227397' 00:28:44.362 killing process with pid 2227397 00:28:44.362 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2227397 00:28:44.362 Received shutdown signal, test time was about 60.000000 seconds 00:28:44.362 00:28:44.362 Latency(us) 00:28:44.362 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:44.362 =================================================================================================================== 00:28:44.362 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:44.362 [2024-07-15 13:47:23.651046] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:44.362 [2024-07-15 13:47:23.651141] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:44.362 [2024-07-15 13:47:23.651187] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:44.362 [2024-07-15 13:47:23.651201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ab7270 name raid_bdev1, state of 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@972 -- # wait 2227397 00:28:44.362 fline 00:28:44.362 [2024-07-15 13:47:23.682787] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:44.621 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:28:44.621 00:28:44.621 real 0m28.675s 00:28:44.621 user 0m45.596s 00:28:44.621 sys 0m3.790s 00:28:44.621 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:44.621 13:47:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:44.621 ************************************ 00:28:44.621 END TEST raid_rebuild_test_sb_md_interleaved 00:28:44.621 ************************************ 00:28:44.621 13:47:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:44.621 13:47:23 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:28:44.621 13:47:23 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:28:44.621 13:47:23 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2227397 ']' 00:28:44.621 13:47:23 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2227397 00:28:44.621 13:47:23 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:28:44.621 00:28:44.621 real 18m37.804s 00:28:44.621 user 31m35.246s 00:28:44.621 sys 3m21.811s 00:28:44.621 13:47:23 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:44.621 13:47:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:44.621 ************************************ 00:28:44.621 END TEST bdev_raid 00:28:44.621 ************************************ 00:28:44.621 13:47:24 -- common/autotest_common.sh@1142 -- # return 0 00:28:44.621 13:47:24 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:44.621 13:47:24 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:44.621 13:47:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:28:44.621 13:47:24 -- common/autotest_common.sh@10 -- # set +x 00:28:44.879 ************************************ 00:28:44.879 START TEST bdevperf_config 00:28:44.879 ************************************ 00:28:44.879 13:47:24 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:44.879 * Looking for test storage... 00:28:44.879 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:44.879 13:47:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:44.879 00:28:44.879 13:47:24 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:44.880 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:44.880 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:44.880 00:28:44.880 
13:47:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:44.880 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:44.880 13:47:24 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 13:47:24.282771] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:48.167 [2024-07-15 13:47:24.282846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2231553 ] 00:28:48.167 Using job config with 4 jobs 00:28:48.167 [2024-07-15 13:47:24.436049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.167 [2024-07-15 13:47:24.561905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.167 cpumask for '\''job0'\'' is too big 00:28:48.167 cpumask for '\''job1'\'' is too big 00:28:48.167 cpumask for '\''job2'\'' is too big 00:28:48.167 cpumask for '\''job3'\'' is too big 00:28:48.167 Running I/O for 2 seconds... 00:28:48.167 00:28:48.167 Latency(us) 00:28:48.167 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23991.65 23.43 0.00 0.00 10662.01 1852.10 16298.52 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23969.50 23.41 0.00 0.00 10647.73 1823.61 14417.92 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23947.57 23.39 0.00 0.00 10632.86 1823.61 12594.31 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 24019.95 23.46 0.00 0.00 10577.44 940.30 10998.65 00:28:48.167 =================================================================================================================== 00:28:48.167 Total : 95928.67 93.68 0.00 0.00 10629.94 940.30 16298.52' 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 13:47:24.282771] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:48.167 [2024-07-15 13:47:24.282846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2231553 ] 00:28:48.167 Using job config with 4 jobs 00:28:48.167 [2024-07-15 13:47:24.436049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.167 [2024-07-15 13:47:24.561905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.167 cpumask for '\''job0'\'' is too big 00:28:48.167 cpumask for '\''job1'\'' is too big 00:28:48.167 cpumask for '\''job2'\'' is too big 00:28:48.167 cpumask for '\''job3'\'' is too big 00:28:48.167 Running I/O for 2 seconds... 00:28:48.167 00:28:48.167 Latency(us) 00:28:48.167 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23991.65 23.43 0.00 0.00 10662.01 1852.10 16298.52 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23969.50 23.41 0.00 0.00 10647.73 1823.61 14417.92 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23947.57 23.39 0.00 0.00 10632.86 1823.61 12594.31 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 24019.95 23.46 0.00 0.00 10577.44 940.30 10998.65 00:28:48.167 =================================================================================================================== 00:28:48.167 Total : 95928.67 93.68 0.00 0.00 10629.94 940.30 16298.52' 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:47:24.282771] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:48.167 [2024-07-15 13:47:24.282846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2231553 ] 00:28:48.167 Using job config with 4 jobs 00:28:48.167 [2024-07-15 13:47:24.436049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.167 [2024-07-15 13:47:24.561905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.167 cpumask for '\''job0'\'' is too big 00:28:48.167 cpumask for '\''job1'\'' is too big 00:28:48.167 cpumask for '\''job2'\'' is too big 00:28:48.167 cpumask for '\''job3'\'' is too big 00:28:48.167 Running I/O for 2 seconds... 00:28:48.167 00:28:48.167 Latency(us) 00:28:48.167 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23991.65 23.43 0.00 0.00 10662.01 1852.10 16298.52 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23969.50 23.41 0.00 0.00 10647.73 1823.61 14417.92 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 23947.57 23.39 0.00 0.00 10632.86 1823.61 12594.31 00:28:48.167 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:48.167 Malloc0 : 2.02 24019.95 23.46 0.00 0.00 10577.44 940.30 10998.65 00:28:48.167 =================================================================================================================== 00:28:48.167 Total : 95928.67 93.68 0.00 0.00 10629.94 940.30 16298.52' 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:48.167 13:47:27 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:28:48.167 13:47:27 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:48.167 [2024-07-15 13:47:27.075190] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:28:48.167 [2024-07-15 13:47:27.075256] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2231912 ] 00:28:48.167 [2024-07-15 13:47:27.216914] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.167 [2024-07-15 13:47:27.324273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.167 cpumask for 'job0' is too big 00:28:48.167 cpumask for 'job1' is too big 00:28:48.167 cpumask for 'job2' is too big 00:28:48.167 cpumask for 'job3' is too big 00:28:50.698 13:47:29 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:28:50.698 Running I/O for 2 seconds... 
00:28:50.698 00:28:50.698 Latency(us) 00:28:50.698 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:50.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:50.698 Malloc0 : 2.02 24130.72 23.57 0.00 0.00 10597.06 1837.86 16412.49 00:28:50.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:50.698 Malloc0 : 2.02 24108.54 23.54 0.00 0.00 10582.95 1837.86 14531.90 00:28:50.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:50.698 Malloc0 : 2.02 24086.43 23.52 0.00 0.00 10568.73 2008.82 12537.32 00:28:50.698 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:50.698 Malloc0 : 2.02 24158.85 23.59 0.00 0.00 10513.10 933.18 10884.67 00:28:50.698 =================================================================================================================== 00:28:50.698 Total : 96484.55 94.22 0.00 0.00 10565.39 933.18 16412.49' 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:50.699 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:50.699 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:50.699 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:50.699 13:47:29 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 13:47:29.799959] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:53.225 [2024-07-15 13:47:29.800026] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232264 ] 00:28:53.225 Using job config with 3 jobs 00:28:53.225 [2024-07-15 13:47:29.941087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.225 [2024-07-15 13:47:30.065354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.225 cpumask for '\''job0'\'' is too big 00:28:53.225 cpumask for '\''job1'\'' is too big 00:28:53.225 cpumask for '\''job2'\'' is too big 00:28:53.225 Running I/O for 2 seconds... 00:28:53.225 00:28:53.225 Latency(us) 00:28:53.225 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32722.41 31.96 0.00 0.00 7820.30 1795.12 11511.54 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32692.34 31.93 0.00 0.00 7810.39 1780.87 9687.93 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.02 32746.55 31.98 0.00 0.00 7780.63 926.05 8092.27 00:28:53.225 =================================================================================================================== 00:28:53.225 Total : 98161.31 95.86 0.00 0.00 7803.75 926.05 11511.54' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 13:47:29.799959] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:53.225 [2024-07-15 13:47:29.800026] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232264 ] 00:28:53.225 Using job config with 3 jobs 00:28:53.225 [2024-07-15 13:47:29.941087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.225 [2024-07-15 13:47:30.065354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.225 cpumask for '\''job0'\'' is too big 00:28:53.225 cpumask for '\''job1'\'' is too big 00:28:53.225 cpumask for '\''job2'\'' is too big 00:28:53.225 Running I/O for 2 seconds... 00:28:53.225 00:28:53.225 Latency(us) 00:28:53.225 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32722.41 31.96 0.00 0.00 7820.30 1795.12 11511.54 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32692.34 31.93 0.00 0.00 7810.39 1780.87 9687.93 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.02 32746.55 31.98 0.00 0.00 7780.63 926.05 8092.27 00:28:53.225 =================================================================================================================== 00:28:53.225 Total : 98161.31 95.86 0.00 0.00 7803.75 926.05 11511.54' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:47:29.799959] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:53.225 [2024-07-15 13:47:29.800026] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232264 ] 00:28:53.225 Using job config with 3 jobs 00:28:53.225 [2024-07-15 13:47:29.941087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.225 [2024-07-15 13:47:30.065354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.225 cpumask for '\''job0'\'' is too big 00:28:53.225 cpumask for '\''job1'\'' is too big 00:28:53.225 cpumask for '\''job2'\'' is too big 00:28:53.225 Running I/O for 2 seconds... 00:28:53.225 00:28:53.225 Latency(us) 00:28:53.225 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32722.41 31.96 0.00 0.00 7820.30 1795.12 11511.54 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.01 32692.34 31.93 0.00 0.00 7810.39 1780.87 9687.93 00:28:53.225 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:53.225 Malloc0 : 2.02 32746.55 31.98 0.00 0.00 7780.63 926.05 8092.27 00:28:53.225 =================================================================================================================== 00:28:53.225 Total : 98161.31 95.86 0.00 0.00 7803.75 926.05 11511.54' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:53.225 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:53.225 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:53.225 13:47:32 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:53.225 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:53.225 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:53.225 00:28:53.225 13:47:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:53.226 13:47:32 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:56.537 13:47:35 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 13:47:32.559182] 
Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:28:56.538 [2024-07-15 13:47:32.559250] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232620 ] 00:28:56.538 Using job config with 4 jobs 00:28:56.538 [2024-07-15 13:47:32.704295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:56.538 [2024-07-15 13:47:32.820346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.538 cpumask for '\''job0'\'' is too big 00:28:56.538 cpumask for '\''job1'\'' is too big 00:28:56.538 cpumask for '\''job2'\'' is too big 00:28:56.538 cpumask for '\''job3'\'' is too big 00:28:56.538 Running I/O for 2 seconds... 00:28:56.538 00:28:56.538 Latency(us) 00:28:56.538 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.02 12018.98 11.74 0.00 0.00 21283.56 3903.67 33052.94 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.04 12023.57 11.74 0.00 0.00 21254.34 4729.99 33052.94 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 12012.78 11.73 0.00 0.00 21197.10 3789.69 29063.79 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.05 12001.73 11.72 0.00 0.00 21199.38 4616.01 29063.79 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11990.96 11.71 0.00 0.00 21142.21 3789.69 25302.59 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 
Malloc1 : 2.05 11979.96 11.70 0.00 0.00 21140.29 4616.01 25302.59 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11969.22 11.69 0.00 0.00 21082.02 3789.69 21655.37 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.06 11958.26 11.68 0.00 0.00 21081.83 4616.01 21655.37 00:28:56.538 =================================================================================================================== 00:28:56.538 Total : 95955.46 93.71 0.00 0.00 21172.45 3789.69 33052.94' 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 13:47:32.559182] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:28:56.538 [2024-07-15 13:47:32.559250] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232620 ] 00:28:56.538 Using job config with 4 jobs 00:28:56.538 [2024-07-15 13:47:32.704295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:56.538 [2024-07-15 13:47:32.820346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.538 cpumask for '\''job0'\'' is too big 00:28:56.538 cpumask for '\''job1'\'' is too big 00:28:56.538 cpumask for '\''job2'\'' is too big 00:28:56.538 cpumask for '\''job3'\'' is too big 00:28:56.538 Running I/O for 2 seconds... 
00:28:56.538 00:28:56.538 Latency(us) 00:28:56.538 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.02 12018.98 11.74 0.00 0.00 21283.56 3903.67 33052.94 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.04 12023.57 11.74 0.00 0.00 21254.34 4729.99 33052.94 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 12012.78 11.73 0.00 0.00 21197.10 3789.69 29063.79 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.05 12001.73 11.72 0.00 0.00 21199.38 4616.01 29063.79 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11990.96 11.71 0.00 0.00 21142.21 3789.69 25302.59 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.05 11979.96 11.70 0.00 0.00 21140.29 4616.01 25302.59 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11969.22 11.69 0.00 0.00 21082.02 3789.69 21655.37 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.06 11958.26 11.68 0.00 0.00 21081.83 4616.01 21655.37 00:28:56.538 =================================================================================================================== 00:28:56.538 Total : 95955.46 93.71 0.00 0.00 21172.45 3789.69 33052.94' 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 13:47:32.559182] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:28:56.538 [2024-07-15 13:47:32.559250] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2232620 ] 00:28:56.538 Using job config with 4 jobs 00:28:56.538 [2024-07-15 13:47:32.704295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:56.538 [2024-07-15 13:47:32.820346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.538 cpumask for '\''job0'\'' is too big 00:28:56.538 cpumask for '\''job1'\'' is too big 00:28:56.538 cpumask for '\''job2'\'' is too big 00:28:56.538 cpumask for '\''job3'\'' is too big 00:28:56.538 Running I/O for 2 seconds... 00:28:56.538 00:28:56.538 Latency(us) 00:28:56.538 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.02 12018.98 11.74 0.00 0.00 21283.56 3903.67 33052.94 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.04 12023.57 11.74 0.00 0.00 21254.34 4729.99 33052.94 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 12012.78 11.73 0.00 0.00 21197.10 3789.69 29063.79 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.05 12001.73 11.72 0.00 0.00 21199.38 4616.01 29063.79 00:28:56.538 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11990.96 11.71 0.00 0.00 21142.21 3789.69 25302.59 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.05 11979.96 11.70 0.00 0.00 21140.29 4616.01 25302.59 00:28:56.538 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc0 : 2.05 11969.22 11.69 0.00 0.00 21082.02 3789.69 21655.37 00:28:56.538 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:56.538 Malloc1 : 2.06 11958.26 11.68 0.00 0.00 21081.83 4616.01 21655.37 00:28:56.538 =================================================================================================================== 00:28:56.538 Total : 95955.46 93.71 0.00 0.00 21172.45 3789.69 33052.94' 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:56.538 13:47:35 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:28:56.538 00:28:56.538 real 0m11.231s 00:28:56.538 user 0m9.888s 00:28:56.538 sys 0m1.167s 00:28:56.538 13:47:35 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:56.538 13:47:35 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:28:56.538 ************************************ 00:28:56.538 END TEST bdevperf_config 00:28:56.538 ************************************ 00:28:56.538 13:47:35 -- common/autotest_common.sh@1142 -- # return 0 00:28:56.538 13:47:35 -- spdk/autotest.sh@192 -- # uname -s 00:28:56.538 13:47:35 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:28:56.538 13:47:35 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:56.538 13:47:35 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:28:56.538 13:47:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:56.538 13:47:35 -- common/autotest_common.sh@10 -- # set +x 00:28:56.538 ************************************ 00:28:56.538 START TEST reactor_set_interrupt 00:28:56.538 ************************************ 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:56.538 * Looking for test storage... 00:28:56.538 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:56.538 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:56.538 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:56.539 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:56.539 13:47:35 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:56.539 13:47:35 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:56.539 13:47:35 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:56.539 13:47:35 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:56.539 13:47:35 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:56.539 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:56.539 13:47:35 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:56.539 #define SPDK_CONFIG_H 00:28:56.539 #define SPDK_CONFIG_APPS 1 00:28:56.539 #define SPDK_CONFIG_ARCH native 00:28:56.539 #undef SPDK_CONFIG_ASAN 00:28:56.539 #undef SPDK_CONFIG_AVAHI 00:28:56.539 #undef SPDK_CONFIG_CET 00:28:56.539 #define SPDK_CONFIG_COVERAGE 1 00:28:56.539 #define SPDK_CONFIG_CROSS_PREFIX 
00:28:56.539 #define SPDK_CONFIG_CRYPTO 1 00:28:56.539 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:56.539 #undef SPDK_CONFIG_CUSTOMOCF 00:28:56.539 #undef SPDK_CONFIG_DAOS 00:28:56.539 #define SPDK_CONFIG_DAOS_DIR 00:28:56.539 #define SPDK_CONFIG_DEBUG 1 00:28:56.539 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:56.539 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:56.539 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:56.539 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:56.539 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:56.539 #undef SPDK_CONFIG_DPDK_UADK 00:28:56.539 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:56.539 #define SPDK_CONFIG_EXAMPLES 1 00:28:56.539 #undef SPDK_CONFIG_FC 00:28:56.539 #define SPDK_CONFIG_FC_PATH 00:28:56.539 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:56.539 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:56.539 #undef SPDK_CONFIG_FUSE 00:28:56.539 #undef SPDK_CONFIG_FUZZER 00:28:56.539 #define SPDK_CONFIG_FUZZER_LIB 00:28:56.539 #undef SPDK_CONFIG_GOLANG 00:28:56.539 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:56.539 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:56.539 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:56.539 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:56.539 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:56.539 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:56.539 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:56.539 #define SPDK_CONFIG_IDXD 1 00:28:56.539 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:56.539 #define SPDK_CONFIG_IPSEC_MB 1 00:28:56.539 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:56.539 #define SPDK_CONFIG_ISAL 1 00:28:56.539 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:56.539 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:56.539 #define SPDK_CONFIG_LIBDIR 00:28:56.539 #undef SPDK_CONFIG_LTO 00:28:56.539 #define SPDK_CONFIG_MAX_LCORES 128 00:28:56.539 #define SPDK_CONFIG_NVME_CUSE 1 00:28:56.540 #undef 
SPDK_CONFIG_OCF 00:28:56.540 #define SPDK_CONFIG_OCF_PATH 00:28:56.540 #define SPDK_CONFIG_OPENSSL_PATH 00:28:56.540 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:56.540 #define SPDK_CONFIG_PGO_DIR 00:28:56.540 #undef SPDK_CONFIG_PGO_USE 00:28:56.540 #define SPDK_CONFIG_PREFIX /usr/local 00:28:56.540 #undef SPDK_CONFIG_RAID5F 00:28:56.540 #undef SPDK_CONFIG_RBD 00:28:56.540 #define SPDK_CONFIG_RDMA 1 00:28:56.540 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:56.540 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:56.540 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:56.540 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:56.540 #define SPDK_CONFIG_SHARED 1 00:28:56.540 #undef SPDK_CONFIG_SMA 00:28:56.540 #define SPDK_CONFIG_TESTS 1 00:28:56.540 #undef SPDK_CONFIG_TSAN 00:28:56.540 #define SPDK_CONFIG_UBLK 1 00:28:56.540 #define SPDK_CONFIG_UBSAN 1 00:28:56.540 #undef SPDK_CONFIG_UNIT_TESTS 00:28:56.540 #undef SPDK_CONFIG_URING 00:28:56.540 #define SPDK_CONFIG_URING_PATH 00:28:56.540 #undef SPDK_CONFIG_URING_ZNS 00:28:56.540 #undef SPDK_CONFIG_USDT 00:28:56.540 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:56.540 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:56.540 #undef SPDK_CONFIG_VFIO_USER 00:28:56.540 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:56.540 #define SPDK_CONFIG_VHOST 1 00:28:56.540 #define SPDK_CONFIG_VIRTIO 1 00:28:56.540 #undef SPDK_CONFIG_VTUNE 00:28:56.540 #define SPDK_CONFIG_VTUNE_DIR 00:28:56.540 #define SPDK_CONFIG_WERROR 1 00:28:56.540 #define SPDK_CONFIG_WPDK_DIR 00:28:56.540 #undef SPDK_CONFIG_XNVME 00:28:56.540 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:56.540 13:47:35 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:28:56.540 13:47:35 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:56.540 13:47:35 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.540 13:47:35 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.540 13:47:35 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.540 13:47:35 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:28:56.540 13:47:35 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:56.540 13:47:35 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:56.540 13:47:35 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:56.540 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:28:56.540 13:47:35 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:56.541 13:47:35 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:28:56.541 
13:47:35 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:56.541 13:47:35 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:56.541 13:47:35 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2233013 ]] 00:28:56.541 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2233013 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.9tS434 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.9tS434/tests/interrupt /tmp/spdk.9tS434 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88736313344 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:28:56.542 13:47:35 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5772201984 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892251136 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9453568 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253417984 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:28:56.542 13:47:35 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=839680 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:28:56.542 * Looking for test storage... 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88736313344 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:28:56.542 13:47:35 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7986794496 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.542 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:56.542 13:47:35 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:56.542 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:56.542 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:56.543 13:47:35 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2233073 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:56.543 13:47:35 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2233073 /var/tmp/spdk.sock 00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2233073 ']' 00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:56.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:56.543 13:47:35 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:56.543 [2024-07-15 13:47:35.724035] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:28:56.543 [2024-07-15 13:47:35.724110] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2233073 ] 00:28:56.543 [2024-07-15 13:47:35.856229] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:56.801 [2024-07-15 13:47:35.960979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:56.801 [2024-07-15 13:47:35.961064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:56.801 [2024-07-15 13:47:35.961068] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.801 [2024-07-15 13:47:36.035744] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:57.366 13:47:36 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:57.366 13:47:36 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:28:57.366 13:47:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:28:57.366 13:47:36 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:57.624 Malloc0 00:28:57.624 Malloc1 00:28:57.624 Malloc2 00:28:57.624 13:47:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:28:57.624 13:47:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:57.624 13:47:36 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:57.624 13:47:36 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:57.624 5000+0 records in 00:28:57.624 5000+0 records out 00:28:57.624 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0262008 s, 391 MB/s 00:28:57.624 13:47:36 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:57.882 AIO0 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2233073 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2233073 without_thd 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2233073 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:57.882 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:58.140 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:58.399 spdk_thread ids are 1 on reactor0. 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233073 0 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233073 0 idle 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:58.399 13:47:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233073 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.39 reactor_0' 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233073 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.39 reactor_0 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:58.657 13:47:37 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233073 1 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233073 1 idle 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:28:58.657 13:47:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233134 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233134 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233073 2 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233073 2 idle 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:58.937 13:47:38 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233135 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233135 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:28:58.937 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:28:59.195 [2024-07-15 13:47:38.553908] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:59.195 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:59.453 [2024-07-15 13:47:38.801598] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:59.453 [2024-07-15 13:47:38.801869] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:59.453 13:47:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:59.710 [2024-07-15 13:47:39.045586] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:28:59.710 [2024-07-15 13:47:39.045729] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2233073 0 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2233073 0 busy 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:59.710 13:47:39 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # grep reactor_0 00:28:59.710 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233073 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233073 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2233073 2 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2233073 2 busy 00:28:59.968 13:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:59.969 13:47:39 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:28:59.969 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233135 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233135 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:00.226 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:00.226 [2024-07-15 13:47:39.649590] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:00.226 [2024-07-15 13:47:39.649701] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2233073 2 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233073 2 idle 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233135 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233135 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:00.484 13:47:39 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:00.484 13:47:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:00.743 [2024-07-15 13:47:40.077595] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:00.743 [2024-07-15 13:47:40.077744] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:00.743 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:29:00.743 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:29:00.743 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:29:01.002 [2024-07-15 13:47:40.321975] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2233073 0 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233073 0 idle 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233073 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233073 -w 256 00:29:01.002 13:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:01.260 13:47:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233073 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.67 reactor_0' 00:29:01.260 13:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233073 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.67 reactor_0 00:29:01.260 13:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:01.260 13:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:01.260 13:47:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:29:01.261 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2233073 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2233073 ']' 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2233073 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2233073 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2233073' 00:29:01.261 killing process with pid 2233073 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2233073 00:29:01.261 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2233073 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:01.519 13:47:40 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2233832 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:01.519 13:47:40 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2233832 /var/tmp/spdk.sock 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2233832 ']' 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:01.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:01.519 13:47:40 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:01.519 [2024-07-15 13:47:40.879054] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:29:01.519 [2024-07-15 13:47:40.879132] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2233832 ] 00:29:01.777 [2024-07-15 13:47:41.009820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:01.777 [2024-07-15 13:47:41.114305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:01.777 [2024-07-15 13:47:41.114392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:01.777 [2024-07-15 13:47:41.114397] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:01.777 [2024-07-15 13:47:41.188931] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:02.710 13:47:41 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:02.710 13:47:41 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:02.710 13:47:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:29:02.710 13:47:41 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:02.710 Malloc0 00:29:02.710 Malloc1 00:29:02.710 Malloc2 00:29:02.710 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:29:02.710 13:47:42 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:02.710 13:47:42 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:02.710 13:47:42 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:02.969 5000+0 records in 00:29:02.969 5000+0 records out 00:29:02.969 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0278334 s, 368 MB/s 
00:29:02.969 13:47:42 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:03.227 AIO0 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2233832 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2233832 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2233832 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:03.227 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:03.484 13:47:42 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:03.484 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:03.743 spdk_thread ids are 1 on reactor0. 
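The thread-id lookup traced above pipes `thread_get_stats` through a jq filter keyed on the reactor's cpumask (hex `0x1` normalized to `1` before matching). A minimal Python restatement of that selection, using an illustrative payload rather than a live RPC (the sample values are made up for the sketch; only the field shapes follow the trace):

```python
# Illustrative thread_get_stats-style payload; values are hypothetical.
stats = {
    "threads": [
        {"id": 1, "cpumask": "1"},   # e.g. app_thread on reactor 0 (0x1)
        {"id": 2, "cpumask": "4"},   # e.g. a thread on reactor 2 (0x4)
    ]
}

def reactor_thread_ids(stats, reactor_cpumask):
    """Same selection as the jq filter in interrupt/common.sh@59:
    .threads|.[]|select(.cpumask == $reactor_cpumask)|.id
    The hex mask ("0x1") is normalized the way the shell does
    before being passed to jq as $reactor_cpumask."""
    mask = str(int(reactor_cpumask, 16))  # "0x1" -> "1", "0x4" -> "4"
    return [t["id"] for t in stats["threads"] if t["cpumask"] == mask]

print(reactor_thread_ids(stats, "0x1"))  # [1]
print(reactor_thread_ids(stats, "0x4"))  # [2]
```

As in the trace (where reactor 2 has no threads yet and the lookup echoes an empty string), a mask with no matching threads simply yields an empty list.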
00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233832 0 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233832 0 idle 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:03.743 13:47:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:03.744 13:47:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233832 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.40 reactor_0' 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233832 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.40 reactor_0 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233832 1 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233832 1 idle 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:03.744 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233835 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1' 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233835 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:04.007 13:47:43 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2233832 2 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233832 2 idle 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:04.007 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233836 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2' 
00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233836 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:29:04.267 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:04.525 [2024-07-15 13:47:43.718997] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:04.525 [2024-07-15 13:47:43.719188] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:29:04.525 [2024-07-15 13:47:43.719319] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:04.525 [2024-07-15 13:47:43.899365] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:29:04.525 [2024-07-15 13:47:43.899553] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2233832 0 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2233832 0 busy 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:04.525 13:47:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233832 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.76 reactor_0' 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233832 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.76 reactor_0 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:04.783 13:47:44 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2233832 2 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2233832 2 busy 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:04.783 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233836 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.35 reactor_2' 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233836 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.35 reactor_2 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:05.040 13:47:44 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:05.040 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:05.298 [2024-07-15 13:47:44.501089] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:29:05.298 [2024-07-15 13:47:44.501206] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2233832 2 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233832 2 idle 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233836 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.60 reactor_2' 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233836 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.60 reactor_2 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:05.298 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:05.556 [2024-07-15 13:47:44.926176] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:05.556 [2024-07-15 13:47:44.926362] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
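The busy/idle probes repeated throughout this trace all reduce to reading top's %CPU field for the reactor thread (`top -bHn 1 ... | grep reactor_N | awk '{print $9}'`), truncating it to an integer, and applying fixed thresholds: the check fails when a supposedly busy reactor is below 70% (`[[ cpu_rate -lt 70 ]]`) or a supposedly idle one is above 30% (`[[ cpu_rate -gt 30 ]]`). A hedged Python restatement of that verdict logic (a sketch of the shell flow, not SPDK code):

```python
def reactor_state_ok(state: str, cpu_field: str) -> bool:
    """Mirror the reactor_is_busy_or_idle verdict from the trace:
    top's %CPU string ("99.9") is truncated to an integer, then a
    busy reactor must run at >= 70% and an idle one at <= 30%."""
    cpu_rate = int(float(cpu_field))   # "99.9" -> 99, "0.0" -> 0
    if state == "busy":
        return cpu_rate >= 70          # shell fails on [[ rate -lt 70 ]]
    if state == "idle":
        return cpu_rate <= 30          # shell fails on [[ rate -gt 30 ]]
    raise ValueError(f"unknown state: {state}")

# Values observed in the trace above
print(reactor_state_ok("busy", "99.9"))  # True: reactor_0 in poll mode
print(reactor_state_ok("idle", "0.0"))   # True: reactor_2 back in intr mode
```

The shell additionally retries this probe up to 10 times (the `(( j = 10 ))` loop) before giving up, since a single top sample can catch a reactor mid-transition.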
00:29:05.556 [2024-07-15 13:47:44.926387] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2233832 0 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2233832 0 idle 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2233832 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:05.556 13:47:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2233832 -w 256 00:29:05.557 13:47:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2233832 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.61 reactor_0' 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2233832 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.61 reactor_0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:05.815 13:47:45 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:29:05.815 13:47:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2233832 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2233832 ']' 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2233832 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2233832 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2233832' 00:29:05.815 killing process with pid 2233832 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2233832 00:29:05.815 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2233832 00:29:06.073 13:47:45 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:29:06.073 13:47:45 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:06.073 00:29:06.073 real 0m10.058s 00:29:06.073 user 0m9.374s 00:29:06.073 sys 0m2.179s 00:29:06.073 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:06.073 13:47:45 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:06.073 ************************************ 00:29:06.073 END TEST reactor_set_interrupt 00:29:06.073 ************************************ 00:29:06.073 13:47:45 -- common/autotest_common.sh@1142 -- # return 0 00:29:06.073 13:47:45 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:06.073 13:47:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:06.073 13:47:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:06.073 13:47:45 -- common/autotest_common.sh@10 -- # set +x 00:29:06.334 ************************************ 00:29:06.334 START TEST reap_unregistered_poller 00:29:06.334 ************************************ 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:06.334 * Looking for test storage... 
00:29:06.334 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:06.334 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:06.334 13:47:45 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:06.334 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:06.334 13:47:45 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:06.334 13:47:45 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:06.335 
13:47:45 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:06.335 13:47:45 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:06.335 13:47:45 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:06.335 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:06.335 13:47:45 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:06.335 #define SPDK_CONFIG_H 00:29:06.335 #define SPDK_CONFIG_APPS 1 00:29:06.335 #define SPDK_CONFIG_ARCH native 00:29:06.335 #undef SPDK_CONFIG_ASAN 00:29:06.335 #undef SPDK_CONFIG_AVAHI 00:29:06.335 #undef SPDK_CONFIG_CET 00:29:06.335 #define SPDK_CONFIG_COVERAGE 1 00:29:06.335 #define SPDK_CONFIG_CROSS_PREFIX 00:29:06.335 #define SPDK_CONFIG_CRYPTO 1 00:29:06.335 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:06.335 #undef SPDK_CONFIG_CUSTOMOCF 00:29:06.335 #undef SPDK_CONFIG_DAOS 00:29:06.335 #define SPDK_CONFIG_DAOS_DIR 00:29:06.335 #define SPDK_CONFIG_DEBUG 1 00:29:06.335 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:06.335 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:06.335 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:06.335 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:06.335 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:06.335 #undef SPDK_CONFIG_DPDK_UADK 00:29:06.335 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:06.335 #define SPDK_CONFIG_EXAMPLES 1 00:29:06.335 #undef SPDK_CONFIG_FC 00:29:06.335 #define SPDK_CONFIG_FC_PATH 00:29:06.335 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:06.335 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:06.335 #undef SPDK_CONFIG_FUSE 00:29:06.335 #undef SPDK_CONFIG_FUZZER 00:29:06.335 #define SPDK_CONFIG_FUZZER_LIB 00:29:06.335 #undef SPDK_CONFIG_GOLANG 00:29:06.335 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:06.335 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:06.335 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:06.335 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:06.335 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:06.335 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:06.335 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:06.335 #define SPDK_CONFIG_IDXD 1 00:29:06.335 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:06.335 #define SPDK_CONFIG_IPSEC_MB 1 00:29:06.335 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:06.335 #define SPDK_CONFIG_ISAL 1 00:29:06.335 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:06.335 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:06.335 #define SPDK_CONFIG_LIBDIR 00:29:06.335 #undef SPDK_CONFIG_LTO 00:29:06.335 #define SPDK_CONFIG_MAX_LCORES 128 00:29:06.335 #define SPDK_CONFIG_NVME_CUSE 1 00:29:06.335 #undef SPDK_CONFIG_OCF 00:29:06.335 #define SPDK_CONFIG_OCF_PATH 00:29:06.335 #define SPDK_CONFIG_OPENSSL_PATH 00:29:06.335 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:06.335 #define SPDK_CONFIG_PGO_DIR 00:29:06.335 #undef SPDK_CONFIG_PGO_USE 00:29:06.335 #define SPDK_CONFIG_PREFIX /usr/local 00:29:06.335 #undef SPDK_CONFIG_RAID5F 00:29:06.335 #undef SPDK_CONFIG_RBD 00:29:06.335 #define SPDK_CONFIG_RDMA 1 00:29:06.335 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:06.335 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:06.335 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:06.335 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:06.335 #define 
SPDK_CONFIG_SHARED 1 00:29:06.335 #undef SPDK_CONFIG_SMA 00:29:06.335 #define SPDK_CONFIG_TESTS 1 00:29:06.335 #undef SPDK_CONFIG_TSAN 00:29:06.335 #define SPDK_CONFIG_UBLK 1 00:29:06.335 #define SPDK_CONFIG_UBSAN 1 00:29:06.335 #undef SPDK_CONFIG_UNIT_TESTS 00:29:06.335 #undef SPDK_CONFIG_URING 00:29:06.335 #define SPDK_CONFIG_URING_PATH 00:29:06.335 #undef SPDK_CONFIG_URING_ZNS 00:29:06.335 #undef SPDK_CONFIG_USDT 00:29:06.335 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:06.335 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:06.335 #undef SPDK_CONFIG_VFIO_USER 00:29:06.335 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:06.335 #define SPDK_CONFIG_VHOST 1 00:29:06.335 #define SPDK_CONFIG_VIRTIO 1 00:29:06.335 #undef SPDK_CONFIG_VTUNE 00:29:06.336 #define SPDK_CONFIG_VTUNE_DIR 00:29:06.336 #define SPDK_CONFIG_WERROR 1 00:29:06.336 #define SPDK_CONFIG_WPDK_DIR 00:29:06.336 #undef SPDK_CONFIG_XNVME 00:29:06.336 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:06.336 13:47:45 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:06.336 13:47:45 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:06.336 13:47:45 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:06.336 13:47:45 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:06.336 13:47:45 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:29:06.336 13:47:45 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:06.336 13:47:45 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:06.336 13:47:45 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:06.336 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:06.336 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:06.336 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:06.336 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:29:06.336 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:06.337 13:47:45 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:06.337 13:47:45 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:29:06.337 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2234620 ]] 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2234620 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:06.337 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.2m5LrP 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.2m5LrP/tests/interrupt /tmp/spdk.2m5LrP 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:06.338 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88736157696 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5772357632 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892251136 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9453568 00:29:06.338 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253417984 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=839680 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:06.597 13:47:45 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:29:06.597 * Looking for test storage... 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88736157696 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:06.597 13:47:45 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7986950144 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.597 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2234661 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2234661 /var/tmp/spdk.sock 00:29:06.597 13:47:45 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2234661 ']' 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:06.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:06.597 13:47:45 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:06.597 [2024-07-15 13:47:45.821615] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:29:06.597 [2024-07-15 13:47:45.821683] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2234661 ] 00:29:06.597 [2024-07-15 13:47:45.953206] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:06.856 [2024-07-15 13:47:46.054425] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:06.856 [2024-07-15 13:47:46.054512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:06.856 [2024-07-15 13:47:46.054516] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:06.856 [2024-07-15 13:47:46.125967] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:07.423 13:47:46 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:07.423 13:47:46 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:29:07.423 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:29:07.423 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:29:07.423 13:47:46 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:07.423 13:47:46 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:07.423 13:47:46 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:07.423 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:29:07.423 "name": "app_thread", 00:29:07.423 "id": 1, 00:29:07.423 "active_pollers": [], 00:29:07.423 "timed_pollers": [ 00:29:07.423 { 00:29:07.423 "name": "rpc_subsystem_poll_servers", 00:29:07.423 "id": 1, 00:29:07.423 "state": "waiting", 00:29:07.423 "run_count": 0, 00:29:07.423 "busy_count": 0, 00:29:07.423 "period_ticks": 9200000 00:29:07.423 } 00:29:07.423 ], 00:29:07.423 "paused_pollers": [] 00:29:07.423 }' 00:29:07.423 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:29:07.681 
13:47:46 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:07.681 5000+0 records in 00:29:07.681 5000+0 records out 00:29:07.681 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0265391 s, 386 MB/s 00:29:07.681 13:47:46 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:07.939 AIO0 00:29:07.939 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:08.199 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:29:08.199 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:29:08.199 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:29:08.199 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:08.199 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:08.199 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:08.199 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:29:08.199 "name": "app_thread", 00:29:08.199 "id": 1, 00:29:08.199 "active_pollers": [], 00:29:08.199 "timed_pollers": [ 00:29:08.199 { 00:29:08.199 "name": "rpc_subsystem_poll_servers", 00:29:08.199 "id": 1, 00:29:08.199 "state": "waiting", 00:29:08.199 "run_count": 0, 00:29:08.199 "busy_count": 0, 
00:29:08.199 "period_ticks": 9200000 00:29:08.199 } 00:29:08.199 ], 00:29:08.199 "paused_pollers": [] 00:29:08.199 }' 00:29:08.199 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:29:08.457 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2234661 00:29:08.457 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2234661 ']' 00:29:08.457 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2234661 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2234661 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 2234661' 00:29:08.458 killing process with pid 2234661 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2234661 00:29:08.458 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2234661 00:29:08.715 13:47:47 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:29:08.715 13:47:47 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:08.715 00:29:08.715 real 0m2.437s 00:29:08.715 user 0m1.555s 00:29:08.715 sys 0m0.638s 00:29:08.715 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:08.715 13:47:47 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:08.715 ************************************ 00:29:08.715 END TEST reap_unregistered_poller 00:29:08.715 ************************************ 00:29:08.715 13:47:47 -- common/autotest_common.sh@1142 -- # return 0 00:29:08.715 13:47:47 -- spdk/autotest.sh@198 -- # uname -s 00:29:08.715 13:47:48 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:29:08.715 13:47:48 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:29:08.715 13:47:48 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:29:08.715 13:47:48 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@260 -- # timing_exit lib 00:29:08.715 13:47:48 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:08.715 13:47:48 -- common/autotest_common.sh@10 -- # set +x 00:29:08.715 13:47:48 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:29:08.715 
13:47:48 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:29:08.715 13:47:48 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:08.715 13:47:48 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:08.715 13:47:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:08.715 13:47:48 -- common/autotest_common.sh@10 -- # set +x 00:29:08.715 ************************************ 00:29:08.715 START TEST compress_compdev 00:29:08.715 ************************************ 00:29:08.715 13:47:48 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:08.973 * Looking for test storage... 
00:29:08.973 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:08.973 13:47:48 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:08.973 13:47:48 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:08.974 13:47:48 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:08.974 13:47:48 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:08.974 13:47:48 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:08.974 13:47:48 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:08.974 13:47:48 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:08.974 13:47:48 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:08.974 13:47:48 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:29:08.974 13:47:48 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:08.974 13:47:48 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2235083 00:29:08.974 13:47:48 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2235083 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2235083 ']' 00:29:08.974 13:47:48 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:08.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:08.974 13:47:48 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:08.974 [2024-07-15 13:47:48.267595] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:29:08.974 [2024-07-15 13:47:48.267666] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2235083 ] 00:29:08.974 [2024-07-15 13:47:48.386623] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:09.232 [2024-07-15 13:47:48.489756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:09.232 [2024-07-15 13:47:48.489762] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:09.828 [2024-07-15 13:47:49.228409] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:10.086 13:47:49 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:10.086 13:47:49 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:10.086 13:47:49 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:29:10.086 13:47:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:10.086 13:47:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:10.650 [2024-07-15 13:47:49.863201] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20ee3c0 PMD being used: compress_qat 00:29:10.650 13:47:49 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:29:10.650 13:47:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:10.907 13:47:50 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:11.164 [ 00:29:11.164 { 00:29:11.164 "name": "Nvme0n1", 00:29:11.164 "aliases": [ 00:29:11.164 "01000000-0000-0000-5cd2-e43197705251" 00:29:11.164 ], 00:29:11.164 "product_name": "NVMe disk", 00:29:11.164 "block_size": 512, 00:29:11.164 "num_blocks": 15002931888, 00:29:11.164 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:11.164 "assigned_rate_limits": { 00:29:11.164 "rw_ios_per_sec": 0, 00:29:11.164 "rw_mbytes_per_sec": 0, 00:29:11.164 "r_mbytes_per_sec": 0, 00:29:11.164 "w_mbytes_per_sec": 0 00:29:11.164 }, 00:29:11.164 "claimed": false, 00:29:11.164 "zoned": false, 00:29:11.164 "supported_io_types": { 00:29:11.164 "read": true, 00:29:11.164 "write": true, 00:29:11.164 "unmap": true, 00:29:11.164 "flush": true, 00:29:11.164 "reset": true, 00:29:11.164 "nvme_admin": true, 00:29:11.164 "nvme_io": true, 00:29:11.164 "nvme_io_md": false, 00:29:11.164 "write_zeroes": true, 00:29:11.164 "zcopy": false, 00:29:11.164 "get_zone_info": false, 00:29:11.164 "zone_management": false, 00:29:11.164 "zone_append": false, 00:29:11.164 "compare": false, 00:29:11.164 "compare_and_write": false, 00:29:11.164 "abort": true, 00:29:11.164 "seek_hole": false, 00:29:11.164 "seek_data": false, 00:29:11.164 "copy": false, 00:29:11.164 "nvme_iov_md": false 00:29:11.164 }, 00:29:11.164 "driver_specific": { 00:29:11.164 "nvme": [ 00:29:11.164 { 00:29:11.164 "pci_address": "0000:5e:00.0", 00:29:11.164 "trid": { 00:29:11.164 "trtype": "PCIe", 00:29:11.164 "traddr": "0000:5e:00.0" 00:29:11.164 }, 00:29:11.164 "ctrlr_data": { 00:29:11.164 "cntlid": 0, 00:29:11.164 "vendor_id": "0x8086", 00:29:11.165 "model_number": "INTEL SSDPF2KX076TZO", 00:29:11.165 
"serial_number": "PHAC0301002G7P6CGN", 00:29:11.165 "firmware_revision": "JCV10200", 00:29:11.165 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:11.165 "oacs": { 00:29:11.165 "security": 1, 00:29:11.165 "format": 1, 00:29:11.165 "firmware": 1, 00:29:11.165 "ns_manage": 1 00:29:11.165 }, 00:29:11.165 "multi_ctrlr": false, 00:29:11.165 "ana_reporting": false 00:29:11.165 }, 00:29:11.165 "vs": { 00:29:11.165 "nvme_version": "1.3" 00:29:11.165 }, 00:29:11.165 "ns_data": { 00:29:11.165 "id": 1, 00:29:11.165 "can_share": false 00:29:11.165 }, 00:29:11.165 "security": { 00:29:11.165 "opal": true 00:29:11.165 } 00:29:11.165 } 00:29:11.165 ], 00:29:11.165 "mp_policy": "active_passive" 00:29:11.165 } 00:29:11.165 } 00:29:11.165 ] 00:29:11.165 13:47:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:11.165 13:47:50 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:11.422 [2024-07-15 13:47:50.604775] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f530d0 PMD being used: compress_qat 00:29:13.948 34e1b390-273c-4de5-9fdb-1f3711d94a52 00:29:13.948 13:47:52 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:13.948 fb627f94-a70d-429f-8806-8b432fc66c53 00:29:13.948 13:47:53 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:13.948 13:47:53 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:14.205 [ 00:29:14.205 { 00:29:14.205 "name": "fb627f94-a70d-429f-8806-8b432fc66c53", 00:29:14.205 "aliases": [ 00:29:14.205 "lvs0/lv0" 00:29:14.205 ], 00:29:14.205 "product_name": "Logical Volume", 00:29:14.205 "block_size": 512, 00:29:14.205 "num_blocks": 204800, 00:29:14.205 "uuid": "fb627f94-a70d-429f-8806-8b432fc66c53", 00:29:14.205 "assigned_rate_limits": { 00:29:14.205 "rw_ios_per_sec": 0, 00:29:14.205 "rw_mbytes_per_sec": 0, 00:29:14.205 "r_mbytes_per_sec": 0, 00:29:14.205 "w_mbytes_per_sec": 0 00:29:14.205 }, 00:29:14.205 "claimed": false, 00:29:14.205 "zoned": false, 00:29:14.205 "supported_io_types": { 00:29:14.205 "read": true, 00:29:14.205 "write": true, 00:29:14.205 "unmap": true, 00:29:14.205 "flush": false, 00:29:14.205 "reset": true, 00:29:14.205 "nvme_admin": false, 00:29:14.205 "nvme_io": false, 00:29:14.205 "nvme_io_md": false, 00:29:14.205 "write_zeroes": true, 00:29:14.205 "zcopy": false, 00:29:14.205 "get_zone_info": false, 00:29:14.205 "zone_management": false, 00:29:14.205 "zone_append": false, 00:29:14.205 "compare": false, 00:29:14.205 "compare_and_write": false, 00:29:14.205 "abort": false, 00:29:14.205 "seek_hole": true, 00:29:14.205 "seek_data": true, 00:29:14.205 "copy": false, 00:29:14.205 "nvme_iov_md": false 00:29:14.205 }, 00:29:14.205 "driver_specific": { 00:29:14.205 "lvol": { 00:29:14.205 "lvol_store_uuid": "34e1b390-273c-4de5-9fdb-1f3711d94a52", 00:29:14.205 "base_bdev": "Nvme0n1", 00:29:14.205 "thin_provision": true, 00:29:14.205 "num_allocated_clusters": 0, 00:29:14.205 "snapshot": false, 00:29:14.205 "clone": false, 00:29:14.205 "esnap_clone": false 00:29:14.205 } 00:29:14.205 } 
00:29:14.205 } 00:29:14.205 ] 00:29:14.205 13:47:53 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:14.205 13:47:53 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:14.205 13:47:53 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:14.463 [2024-07-15 13:47:53.799388] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:14.463 COMP_lvs0/lv0 00:29:14.463 13:47:53 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:14.463 13:47:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:14.721 13:47:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:14.979 [ 00:29:14.979 { 00:29:14.979 "name": "COMP_lvs0/lv0", 00:29:14.979 "aliases": [ 00:29:14.979 "1655b596-6044-524c-a7a9-9887e815de1e" 00:29:14.979 ], 00:29:14.979 "product_name": "compress", 00:29:14.979 "block_size": 512, 00:29:14.979 "num_blocks": 200704, 00:29:14.979 "uuid": "1655b596-6044-524c-a7a9-9887e815de1e", 00:29:14.979 "assigned_rate_limits": { 00:29:14.979 "rw_ios_per_sec": 0, 00:29:14.979 "rw_mbytes_per_sec": 0, 00:29:14.979 "r_mbytes_per_sec": 0, 00:29:14.979 "w_mbytes_per_sec": 0 00:29:14.979 
}, 00:29:14.979 "claimed": false, 00:29:14.979 "zoned": false, 00:29:14.979 "supported_io_types": { 00:29:14.979 "read": true, 00:29:14.979 "write": true, 00:29:14.979 "unmap": false, 00:29:14.979 "flush": false, 00:29:14.979 "reset": false, 00:29:14.979 "nvme_admin": false, 00:29:14.979 "nvme_io": false, 00:29:14.979 "nvme_io_md": false, 00:29:14.979 "write_zeroes": true, 00:29:14.979 "zcopy": false, 00:29:14.979 "get_zone_info": false, 00:29:14.979 "zone_management": false, 00:29:14.979 "zone_append": false, 00:29:14.979 "compare": false, 00:29:14.979 "compare_and_write": false, 00:29:14.979 "abort": false, 00:29:14.979 "seek_hole": false, 00:29:14.979 "seek_data": false, 00:29:14.979 "copy": false, 00:29:14.979 "nvme_iov_md": false 00:29:14.979 }, 00:29:14.979 "driver_specific": { 00:29:14.979 "compress": { 00:29:14.979 "name": "COMP_lvs0/lv0", 00:29:14.979 "base_bdev_name": "fb627f94-a70d-429f-8806-8b432fc66c53" 00:29:14.979 } 00:29:14.979 } 00:29:14.979 } 00:29:14.979 ] 00:29:14.979 13:47:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:14.979 13:47:54 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:15.237 [2024-07-15 13:47:54.405785] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f17281b15c0 PMD being used: compress_qat 00:29:15.237 [2024-07-15 13:47:54.408031] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20eb670 PMD being used: compress_qat 00:29:15.237 Running I/O for 3 seconds... 
00:29:18.513 00:29:18.513 Latency(us) 00:29:18.513 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:18.513 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:18.513 Verification LBA range: start 0x0 length 0x3100 00:29:18.513 COMP_lvs0/lv0 : 3.00 5150.19 20.12 0.00 0.00 6160.66 537.82 5698.78 00:29:18.513 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:18.513 Verification LBA range: start 0x3100 length 0x3100 00:29:18.513 COMP_lvs0/lv0 : 3.00 5418.41 21.17 0.00 0.00 5869.20 379.33 5613.30 00:29:18.513 =================================================================================================================== 00:29:18.513 Total : 10568.60 41.28 0.00 0.00 6011.25 379.33 5698.78 00:29:18.513 0 00:29:18.513 13:47:57 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:18.513 13:47:57 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:18.513 13:47:57 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:18.513 13:47:57 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:18.514 13:47:57 compress_compdev -- compress/compress.sh@78 -- # killprocess 2235083 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2235083 ']' 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2235083 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2235083 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2235083' 00:29:18.514 killing process with pid 2235083 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@967 -- # kill 2235083 00:29:18.514 Received shutdown signal, test time was about 3.000000 seconds 00:29:18.514 00:29:18.514 Latency(us) 00:29:18.514 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:18.514 =================================================================================================================== 00:29:18.514 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:18.514 13:47:57 compress_compdev -- common/autotest_common.sh@972 -- # wait 2235083 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2236759 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:21.793 13:48:00 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2236759 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2236759 ']' 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:21.793 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:21.793 13:48:00 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:21.793 [2024-07-15 13:48:01.001004] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:29:21.793 [2024-07-15 13:48:01.001080] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2236759 ] 00:29:21.793 [2024-07-15 13:48:01.121684] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:22.050 [2024-07-15 13:48:01.228147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:22.050 [2024-07-15 13:48:01.228150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:22.615 [2024-07-15 13:48:01.982577] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:22.872 13:48:02 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:22.872 13:48:02 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:22.872 13:48:02 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:29:22.872 13:48:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:22.872 13:48:02 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:23.438 [2024-07-15 13:48:02.622962] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ce23c0 PMD being used: compress_qat 00:29:23.438 13:48:02 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:23.438 13:48:02 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:23.695 13:48:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:23.953 [ 00:29:23.953 { 00:29:23.953 "name": "Nvme0n1", 00:29:23.953 "aliases": [ 00:29:23.953 "01000000-0000-0000-5cd2-e43197705251" 00:29:23.953 ], 00:29:23.953 "product_name": "NVMe disk", 00:29:23.953 "block_size": 512, 00:29:23.953 "num_blocks": 15002931888, 00:29:23.953 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:23.953 "assigned_rate_limits": { 00:29:23.953 "rw_ios_per_sec": 0, 00:29:23.953 "rw_mbytes_per_sec": 0, 00:29:23.953 "r_mbytes_per_sec": 0, 00:29:23.953 "w_mbytes_per_sec": 0 00:29:23.953 }, 00:29:23.953 "claimed": false, 00:29:23.953 "zoned": false, 00:29:23.953 "supported_io_types": { 00:29:23.953 "read": true, 00:29:23.953 "write": true, 00:29:23.953 "unmap": true, 00:29:23.953 "flush": true, 00:29:23.953 "reset": true, 00:29:23.953 "nvme_admin": true, 00:29:23.953 "nvme_io": true, 00:29:23.953 "nvme_io_md": false, 00:29:23.953 "write_zeroes": true, 00:29:23.953 "zcopy": false, 00:29:23.953 "get_zone_info": false, 00:29:23.953 "zone_management": false, 00:29:23.953 "zone_append": false, 00:29:23.953 "compare": false, 00:29:23.953 "compare_and_write": false, 00:29:23.953 "abort": true, 
00:29:23.953 "seek_hole": false, 00:29:23.953 "seek_data": false, 00:29:23.953 "copy": false, 00:29:23.953 "nvme_iov_md": false 00:29:23.953 }, 00:29:23.953 "driver_specific": { 00:29:23.953 "nvme": [ 00:29:23.953 { 00:29:23.953 "pci_address": "0000:5e:00.0", 00:29:23.953 "trid": { 00:29:23.953 "trtype": "PCIe", 00:29:23.953 "traddr": "0000:5e:00.0" 00:29:23.953 }, 00:29:23.953 "ctrlr_data": { 00:29:23.953 "cntlid": 0, 00:29:23.953 "vendor_id": "0x8086", 00:29:23.953 "model_number": "INTEL SSDPF2KX076TZO", 00:29:23.953 "serial_number": "PHAC0301002G7P6CGN", 00:29:23.953 "firmware_revision": "JCV10200", 00:29:23.953 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:23.953 "oacs": { 00:29:23.953 "security": 1, 00:29:23.953 "format": 1, 00:29:23.953 "firmware": 1, 00:29:23.953 "ns_manage": 1 00:29:23.953 }, 00:29:23.953 "multi_ctrlr": false, 00:29:23.953 "ana_reporting": false 00:29:23.953 }, 00:29:23.953 "vs": { 00:29:23.953 "nvme_version": "1.3" 00:29:23.953 }, 00:29:23.953 "ns_data": { 00:29:23.953 "id": 1, 00:29:23.953 "can_share": false 00:29:23.953 }, 00:29:23.953 "security": { 00:29:23.953 "opal": true 00:29:23.953 } 00:29:23.953 } 00:29:23.953 ], 00:29:23.953 "mp_policy": "active_passive" 00:29:23.953 } 00:29:23.953 } 00:29:23.953 ] 00:29:23.953 13:48:03 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:23.953 13:48:03 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:24.211 [2024-07-15 13:48:03.400841] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b470d0 PMD being used: compress_qat 00:29:26.735 783cf3ef-e122-4083-b570-70419b3fa403 00:29:26.735 13:48:05 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:26.735 6321fad9-fcdc-45ab-981b-a7da23325b13 00:29:26.735 13:48:05 compress_compdev -- 
compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:26.735 13:48:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:26.735 13:48:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:26.992 [ 00:29:26.992 { 00:29:26.992 "name": "6321fad9-fcdc-45ab-981b-a7da23325b13", 00:29:26.992 "aliases": [ 00:29:26.992 "lvs0/lv0" 00:29:26.992 ], 00:29:26.992 "product_name": "Logical Volume", 00:29:26.992 "block_size": 512, 00:29:26.992 "num_blocks": 204800, 00:29:26.992 "uuid": "6321fad9-fcdc-45ab-981b-a7da23325b13", 00:29:26.992 "assigned_rate_limits": { 00:29:26.992 "rw_ios_per_sec": 0, 00:29:26.992 "rw_mbytes_per_sec": 0, 00:29:26.992 "r_mbytes_per_sec": 0, 00:29:26.992 "w_mbytes_per_sec": 0 00:29:26.992 }, 00:29:26.992 "claimed": false, 00:29:26.992 "zoned": false, 00:29:26.992 "supported_io_types": { 00:29:26.992 "read": true, 00:29:26.992 "write": true, 00:29:26.992 "unmap": true, 00:29:26.992 "flush": false, 00:29:26.992 "reset": true, 00:29:26.992 "nvme_admin": false, 00:29:26.992 "nvme_io": false, 00:29:26.992 "nvme_io_md": false, 00:29:26.992 "write_zeroes": true, 00:29:26.992 "zcopy": false, 00:29:26.992 "get_zone_info": false, 00:29:26.992 "zone_management": false, 00:29:26.992 "zone_append": false, 00:29:26.992 "compare": false, 00:29:26.992 "compare_and_write": false, 00:29:26.992 "abort": false, 
00:29:26.992 "seek_hole": true, 00:29:26.992 "seek_data": true, 00:29:26.992 "copy": false, 00:29:26.992 "nvme_iov_md": false 00:29:26.992 }, 00:29:26.992 "driver_specific": { 00:29:26.992 "lvol": { 00:29:26.992 "lvol_store_uuid": "783cf3ef-e122-4083-b570-70419b3fa403", 00:29:26.992 "base_bdev": "Nvme0n1", 00:29:26.992 "thin_provision": true, 00:29:26.992 "num_allocated_clusters": 0, 00:29:26.992 "snapshot": false, 00:29:26.992 "clone": false, 00:29:26.992 "esnap_clone": false 00:29:26.992 } 00:29:26.992 } 00:29:26.992 } 00:29:26.992 ] 00:29:26.992 13:48:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:26.992 13:48:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:26.992 13:48:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:27.250 [2024-07-15 13:48:06.615327] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:27.250 COMP_lvs0/lv0 00:29:27.250 13:48:06 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:27.250 13:48:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:27.508 13:48:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 
00:29:27.765 [ 00:29:27.765 { 00:29:27.765 "name": "COMP_lvs0/lv0", 00:29:27.765 "aliases": [ 00:29:27.765 "0559042d-7d03-5026-9e3d-b657188e87c9" 00:29:27.765 ], 00:29:27.765 "product_name": "compress", 00:29:27.765 "block_size": 512, 00:29:27.765 "num_blocks": 200704, 00:29:27.765 "uuid": "0559042d-7d03-5026-9e3d-b657188e87c9", 00:29:27.765 "assigned_rate_limits": { 00:29:27.765 "rw_ios_per_sec": 0, 00:29:27.765 "rw_mbytes_per_sec": 0, 00:29:27.765 "r_mbytes_per_sec": 0, 00:29:27.765 "w_mbytes_per_sec": 0 00:29:27.765 }, 00:29:27.765 "claimed": false, 00:29:27.765 "zoned": false, 00:29:27.765 "supported_io_types": { 00:29:27.765 "read": true, 00:29:27.765 "write": true, 00:29:27.765 "unmap": false, 00:29:27.765 "flush": false, 00:29:27.765 "reset": false, 00:29:27.765 "nvme_admin": false, 00:29:27.765 "nvme_io": false, 00:29:27.765 "nvme_io_md": false, 00:29:27.765 "write_zeroes": true, 00:29:27.765 "zcopy": false, 00:29:27.765 "get_zone_info": false, 00:29:27.765 "zone_management": false, 00:29:27.765 "zone_append": false, 00:29:27.765 "compare": false, 00:29:27.765 "compare_and_write": false, 00:29:27.765 "abort": false, 00:29:27.765 "seek_hole": false, 00:29:27.765 "seek_data": false, 00:29:27.765 "copy": false, 00:29:27.765 "nvme_iov_md": false 00:29:27.765 }, 00:29:27.765 "driver_specific": { 00:29:27.765 "compress": { 00:29:27.765 "name": "COMP_lvs0/lv0", 00:29:27.765 "base_bdev_name": "6321fad9-fcdc-45ab-981b-a7da23325b13" 00:29:27.765 } 00:29:27.765 } 00:29:27.765 } 00:29:27.765 ] 00:29:27.765 13:48:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:27.765 13:48:07 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:28.024 [2024-07-15 13:48:07.225703] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb9941b15c0 PMD being used: compress_qat 00:29:28.024 [2024-07-15 13:48:07.227954] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x1cdf700 PMD being used: compress_qat 00:29:28.024 Running I/O for 3 seconds... 00:29:31.341 00:29:31.341 Latency(us) 00:29:31.341 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:31.341 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:31.341 Verification LBA range: start 0x0 length 0x3100 00:29:31.341 COMP_lvs0/lv0 : 3.00 5146.47 20.10 0.00 0.00 6167.47 473.71 7038.00 00:29:31.341 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:31.341 Verification LBA range: start 0x3100 length 0x3100 00:29:31.341 COMP_lvs0/lv0 : 3.00 5415.47 21.15 0.00 0.00 5873.40 336.58 6924.02 00:29:31.341 =================================================================================================================== 00:29:31.341 Total : 10561.94 41.26 0.00 0.00 6016.69 336.58 7038.00 00:29:31.341 0 00:29:31.341 13:48:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:31.341 13:48:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:31.341 13:48:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:31.341 13:48:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:31.341 13:48:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 2236759 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2236759 ']' 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2236759 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2236759 
00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2236759' 00:29:31.341 killing process with pid 2236759 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@967 -- # kill 2236759 00:29:31.341 Received shutdown signal, test time was about 3.000000 seconds 00:29:31.341 00:29:31.341 Latency(us) 00:29:31.341 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:31.341 =================================================================================================================== 00:29:31.341 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:31.341 13:48:10 compress_compdev -- common/autotest_common.sh@972 -- # wait 2236759 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2238818 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:34.620 13:48:13 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2238818 00:29:34.620 13:48:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2238818 ']' 00:29:34.620 13:48:13 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:34.620 13:48:13 compress_compdev -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:29:34.620 13:48:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:34.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:34.620 13:48:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:34.620 13:48:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:34.620 [2024-07-15 13:48:13.820118] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:29:34.620 [2024-07-15 13:48:13.820191] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2238818 ] 00:29:34.620 [2024-07-15 13:48:13.939224] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:34.620 [2024-07-15 13:48:14.041408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:34.620 [2024-07-15 13:48:14.041415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:35.555 [2024-07-15 13:48:14.784780] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:35.555 13:48:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:35.555 13:48:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:35.555 13:48:14 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:29:35.555 13:48:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:35.555 13:48:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:36.120 [2024-07-15 13:48:15.422823] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x13363c0 PMD being used: compress_qat 00:29:36.120 13:48:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:36.120 13:48:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:36.377 13:48:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:36.635 [ 00:29:36.635 { 00:29:36.635 "name": "Nvme0n1", 00:29:36.635 "aliases": [ 00:29:36.635 "01000000-0000-0000-5cd2-e43197705251" 00:29:36.635 ], 00:29:36.635 "product_name": "NVMe disk", 00:29:36.635 "block_size": 512, 00:29:36.635 "num_blocks": 15002931888, 00:29:36.635 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:36.635 "assigned_rate_limits": { 00:29:36.635 "rw_ios_per_sec": 0, 00:29:36.635 "rw_mbytes_per_sec": 0, 00:29:36.635 "r_mbytes_per_sec": 0, 00:29:36.635 "w_mbytes_per_sec": 0 00:29:36.635 }, 00:29:36.635 "claimed": false, 00:29:36.635 "zoned": false, 00:29:36.635 "supported_io_types": { 00:29:36.635 "read": true, 00:29:36.635 "write": true, 00:29:36.635 "unmap": true, 00:29:36.635 "flush": true, 00:29:36.635 "reset": true, 00:29:36.635 "nvme_admin": true, 00:29:36.635 "nvme_io": true, 00:29:36.635 "nvme_io_md": false, 00:29:36.635 "write_zeroes": true, 00:29:36.635 "zcopy": false, 00:29:36.635 "get_zone_info": false, 00:29:36.635 "zone_management": false, 00:29:36.635 "zone_append": 
false, 00:29:36.635 "compare": false, 00:29:36.635 "compare_and_write": false, 00:29:36.635 "abort": true, 00:29:36.635 "seek_hole": false, 00:29:36.635 "seek_data": false, 00:29:36.635 "copy": false, 00:29:36.635 "nvme_iov_md": false 00:29:36.635 }, 00:29:36.635 "driver_specific": { 00:29:36.635 "nvme": [ 00:29:36.635 { 00:29:36.635 "pci_address": "0000:5e:00.0", 00:29:36.635 "trid": { 00:29:36.635 "trtype": "PCIe", 00:29:36.635 "traddr": "0000:5e:00.0" 00:29:36.635 }, 00:29:36.635 "ctrlr_data": { 00:29:36.635 "cntlid": 0, 00:29:36.635 "vendor_id": "0x8086", 00:29:36.635 "model_number": "INTEL SSDPF2KX076TZO", 00:29:36.635 "serial_number": "PHAC0301002G7P6CGN", 00:29:36.635 "firmware_revision": "JCV10200", 00:29:36.635 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:36.635 "oacs": { 00:29:36.635 "security": 1, 00:29:36.635 "format": 1, 00:29:36.635 "firmware": 1, 00:29:36.635 "ns_manage": 1 00:29:36.635 }, 00:29:36.635 "multi_ctrlr": false, 00:29:36.635 "ana_reporting": false 00:29:36.635 }, 00:29:36.635 "vs": { 00:29:36.635 "nvme_version": "1.3" 00:29:36.635 }, 00:29:36.635 "ns_data": { 00:29:36.635 "id": 1, 00:29:36.635 "can_share": false 00:29:36.635 }, 00:29:36.635 "security": { 00:29:36.635 "opal": true 00:29:36.635 } 00:29:36.635 } 00:29:36.635 ], 00:29:36.635 "mp_policy": "active_passive" 00:29:36.635 } 00:29:36.635 } 00:29:36.635 ] 00:29:36.635 13:48:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:36.635 13:48:15 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:36.892 [2024-07-15 13:48:16.164346] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x119b660 PMD being used: compress_qat 00:29:39.418 3d650203-d1ce-4257-bb83-20682af96b19 00:29:39.418 13:48:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l 
lvs0 lv0 100 00:29:39.418 b0a67423-dcb5-48fa-ab38-c49335739fa3 00:29:39.418 13:48:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:39.418 13:48:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:39.677 [ 00:29:39.677 { 00:29:39.677 "name": "b0a67423-dcb5-48fa-ab38-c49335739fa3", 00:29:39.677 "aliases": [ 00:29:39.677 "lvs0/lv0" 00:29:39.677 ], 00:29:39.677 "product_name": "Logical Volume", 00:29:39.677 "block_size": 512, 00:29:39.677 "num_blocks": 204800, 00:29:39.677 "uuid": "b0a67423-dcb5-48fa-ab38-c49335739fa3", 00:29:39.677 "assigned_rate_limits": { 00:29:39.677 "rw_ios_per_sec": 0, 00:29:39.677 "rw_mbytes_per_sec": 0, 00:29:39.677 "r_mbytes_per_sec": 0, 00:29:39.677 "w_mbytes_per_sec": 0 00:29:39.677 }, 00:29:39.677 "claimed": false, 00:29:39.677 "zoned": false, 00:29:39.677 "supported_io_types": { 00:29:39.677 "read": true, 00:29:39.677 "write": true, 00:29:39.677 "unmap": true, 00:29:39.677 "flush": false, 00:29:39.677 "reset": true, 00:29:39.677 "nvme_admin": false, 00:29:39.677 "nvme_io": false, 00:29:39.677 "nvme_io_md": false, 00:29:39.677 "write_zeroes": true, 00:29:39.677 "zcopy": false, 00:29:39.677 "get_zone_info": false, 00:29:39.677 "zone_management": false, 00:29:39.677 "zone_append": false, 
00:29:39.677 "compare": false, 00:29:39.677 "compare_and_write": false, 00:29:39.677 "abort": false, 00:29:39.677 "seek_hole": true, 00:29:39.677 "seek_data": true, 00:29:39.677 "copy": false, 00:29:39.677 "nvme_iov_md": false 00:29:39.677 }, 00:29:39.677 "driver_specific": { 00:29:39.677 "lvol": { 00:29:39.677 "lvol_store_uuid": "3d650203-d1ce-4257-bb83-20682af96b19", 00:29:39.677 "base_bdev": "Nvme0n1", 00:29:39.677 "thin_provision": true, 00:29:39.677 "num_allocated_clusters": 0, 00:29:39.677 "snapshot": false, 00:29:39.677 "clone": false, 00:29:39.677 "esnap_clone": false 00:29:39.677 } 00:29:39.677 } 00:29:39.677 } 00:29:39.677 ] 00:29:39.677 13:48:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:39.677 13:48:19 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:29:39.677 13:48:19 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:29:39.935 [2024-07-15 13:48:19.294849] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:39.935 COMP_lvs0/lv0 00:29:39.935 13:48:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:39.935 13:48:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:40.192 13:48:19 compress_compdev -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:40.450 [ 00:29:40.450 { 00:29:40.450 "name": "COMP_lvs0/lv0", 00:29:40.450 "aliases": [ 00:29:40.450 "059591dc-985a-5b93-b4c8-bbe5c2d2bc3c" 00:29:40.450 ], 00:29:40.450 "product_name": "compress", 00:29:40.450 "block_size": 4096, 00:29:40.450 "num_blocks": 25088, 00:29:40.450 "uuid": "059591dc-985a-5b93-b4c8-bbe5c2d2bc3c", 00:29:40.450 "assigned_rate_limits": { 00:29:40.450 "rw_ios_per_sec": 0, 00:29:40.450 "rw_mbytes_per_sec": 0, 00:29:40.450 "r_mbytes_per_sec": 0, 00:29:40.450 "w_mbytes_per_sec": 0 00:29:40.450 }, 00:29:40.450 "claimed": false, 00:29:40.450 "zoned": false, 00:29:40.450 "supported_io_types": { 00:29:40.450 "read": true, 00:29:40.450 "write": true, 00:29:40.450 "unmap": false, 00:29:40.450 "flush": false, 00:29:40.450 "reset": false, 00:29:40.450 "nvme_admin": false, 00:29:40.450 "nvme_io": false, 00:29:40.450 "nvme_io_md": false, 00:29:40.450 "write_zeroes": true, 00:29:40.450 "zcopy": false, 00:29:40.450 "get_zone_info": false, 00:29:40.450 "zone_management": false, 00:29:40.450 "zone_append": false, 00:29:40.450 "compare": false, 00:29:40.450 "compare_and_write": false, 00:29:40.450 "abort": false, 00:29:40.450 "seek_hole": false, 00:29:40.450 "seek_data": false, 00:29:40.450 "copy": false, 00:29:40.450 "nvme_iov_md": false 00:29:40.450 }, 00:29:40.450 "driver_specific": { 00:29:40.450 "compress": { 00:29:40.450 "name": "COMP_lvs0/lv0", 00:29:40.450 "base_bdev_name": "b0a67423-dcb5-48fa-ab38-c49335739fa3" 00:29:40.450 } 00:29:40.450 } 00:29:40.450 } 00:29:40.450 ] 00:29:40.450 13:48:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:40.450 13:48:19 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:40.707 [2024-07-15 13:48:19.921254] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8a6c1b15c0 PMD being 
used: compress_qat 00:29:40.708 [2024-07-15 13:48:19.923469] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1333770 PMD being used: compress_qat 00:29:40.708 Running I/O for 3 seconds... 00:29:43.995 00:29:43.995 Latency(us) 00:29:43.995 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:43.995 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:43.995 Verification LBA range: start 0x0 length 0x3100 00:29:43.995 COMP_lvs0/lv0 : 3.00 5089.06 19.88 0.00 0.00 6236.97 512.89 7522.39 00:29:43.995 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:43.995 Verification LBA range: start 0x3100 length 0x3100 00:29:43.995 COMP_lvs0/lv0 : 3.00 5328.73 20.82 0.00 0.00 5968.01 382.89 7265.95 00:29:43.995 =================================================================================================================== 00:29:43.995 Total : 10417.79 40.69 0.00 0.00 6099.39 382.89 7522.39 00:29:43.995 0 00:29:43.995 13:48:22 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:43.995 13:48:22 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:43.995 13:48:23 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:44.255 13:48:23 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:44.255 13:48:23 compress_compdev -- compress/compress.sh@78 -- # killprocess 2238818 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2238818 ']' 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2238818 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:44.255 
13:48:23 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2238818 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2238818' 00:29:44.255 killing process with pid 2238818 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@967 -- # kill 2238818 00:29:44.255 Received shutdown signal, test time was about 3.000000 seconds 00:29:44.255 00:29:44.255 Latency(us) 00:29:44.255 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:44.255 =================================================================================================================== 00:29:44.255 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:44.255 13:48:23 compress_compdev -- common/autotest_common.sh@972 -- # wait 2238818 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2240418 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:29:47.532 13:48:26 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2240418 00:29:47.532 13:48:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2240418 ']' 00:29:47.532 13:48:26 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:47.532 13:48:26 
compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:47.533 13:48:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:47.533 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:47.533 13:48:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:47.533 13:48:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:47.533 [2024-07-15 13:48:26.569607] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:29:47.533 [2024-07-15 13:48:26.569679] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2240418 ] 00:29:47.533 [2024-07-15 13:48:26.700442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:47.533 [2024-07-15 13:48:26.808935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:47.533 [2024-07-15 13:48:26.809007] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:47.533 [2024-07-15 13:48:26.809012] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:48.483 [2024-07-15 13:48:27.549136] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:48.483 13:48:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:48.483 13:48:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:48.483 13:48:27 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:29:48.483 13:48:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:48.483 13:48:27 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:49.048 [2024-07-15 13:48:28.183487] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x277df20 PMD being used: compress_qat 00:29:49.048 13:48:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:49.048 13:48:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:49.306 [ 00:29:49.306 { 00:29:49.306 "name": "Nvme0n1", 00:29:49.306 "aliases": [ 00:29:49.306 "01000000-0000-0000-5cd2-e43197705251" 00:29:49.306 ], 00:29:49.306 "product_name": "NVMe disk", 00:29:49.306 "block_size": 512, 00:29:49.306 "num_blocks": 15002931888, 00:29:49.306 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:49.306 "assigned_rate_limits": { 00:29:49.306 "rw_ios_per_sec": 0, 00:29:49.306 "rw_mbytes_per_sec": 0, 00:29:49.306 "r_mbytes_per_sec": 0, 00:29:49.306 "w_mbytes_per_sec": 0 00:29:49.306 }, 00:29:49.306 "claimed": false, 00:29:49.306 "zoned": false, 00:29:49.306 "supported_io_types": { 00:29:49.306 "read": true, 00:29:49.306 "write": true, 00:29:49.306 "unmap": true, 00:29:49.306 "flush": true, 00:29:49.306 "reset": true, 00:29:49.306 "nvme_admin": true, 00:29:49.306 "nvme_io": true, 00:29:49.306 "nvme_io_md": false, 00:29:49.306 
"write_zeroes": true, 00:29:49.306 "zcopy": false, 00:29:49.306 "get_zone_info": false, 00:29:49.306 "zone_management": false, 00:29:49.306 "zone_append": false, 00:29:49.306 "compare": false, 00:29:49.306 "compare_and_write": false, 00:29:49.306 "abort": true, 00:29:49.306 "seek_hole": false, 00:29:49.306 "seek_data": false, 00:29:49.306 "copy": false, 00:29:49.306 "nvme_iov_md": false 00:29:49.306 }, 00:29:49.306 "driver_specific": { 00:29:49.306 "nvme": [ 00:29:49.306 { 00:29:49.306 "pci_address": "0000:5e:00.0", 00:29:49.306 "trid": { 00:29:49.306 "trtype": "PCIe", 00:29:49.306 "traddr": "0000:5e:00.0" 00:29:49.306 }, 00:29:49.306 "ctrlr_data": { 00:29:49.306 "cntlid": 0, 00:29:49.306 "vendor_id": "0x8086", 00:29:49.306 "model_number": "INTEL SSDPF2KX076TZO", 00:29:49.306 "serial_number": "PHAC0301002G7P6CGN", 00:29:49.306 "firmware_revision": "JCV10200", 00:29:49.306 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:49.306 "oacs": { 00:29:49.306 "security": 1, 00:29:49.306 "format": 1, 00:29:49.306 "firmware": 1, 00:29:49.306 "ns_manage": 1 00:29:49.306 }, 00:29:49.306 "multi_ctrlr": false, 00:29:49.306 "ana_reporting": false 00:29:49.306 }, 00:29:49.306 "vs": { 00:29:49.306 "nvme_version": "1.3" 00:29:49.306 }, 00:29:49.306 "ns_data": { 00:29:49.306 "id": 1, 00:29:49.306 "can_share": false 00:29:49.306 }, 00:29:49.306 "security": { 00:29:49.306 "opal": true 00:29:49.306 } 00:29:49.306 } 00:29:49.306 ], 00:29:49.306 "mp_policy": "active_passive" 00:29:49.306 } 00:29:49.306 } 00:29:49.306 ] 00:29:49.306 13:48:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:49.306 13:48:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:49.563 [2024-07-15 13:48:28.937062] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25cc440 PMD being used: compress_qat 00:29:52.155 aee596d4-b3ca-41cb-8109-cb9fbd0dd9f1 
00:29:52.155 13:48:31 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:52.412 279dcada-1b4d-4f73-917d-ffd161fd50b8 00:29:52.412 13:48:31 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:52.412 13:48:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:52.669 13:48:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:52.927 [ 00:29:52.927 { 00:29:52.927 "name": "279dcada-1b4d-4f73-917d-ffd161fd50b8", 00:29:52.927 "aliases": [ 00:29:52.927 "lvs0/lv0" 00:29:52.927 ], 00:29:52.927 "product_name": "Logical Volume", 00:29:52.927 "block_size": 512, 00:29:52.927 "num_blocks": 204800, 00:29:52.927 "uuid": "279dcada-1b4d-4f73-917d-ffd161fd50b8", 00:29:52.927 "assigned_rate_limits": { 00:29:52.927 "rw_ios_per_sec": 0, 00:29:52.927 "rw_mbytes_per_sec": 0, 00:29:52.927 "r_mbytes_per_sec": 0, 00:29:52.927 "w_mbytes_per_sec": 0 00:29:52.927 }, 00:29:52.927 "claimed": false, 00:29:52.927 "zoned": false, 00:29:52.927 "supported_io_types": { 00:29:52.927 "read": true, 00:29:52.927 "write": true, 00:29:52.927 "unmap": true, 00:29:52.927 "flush": false, 00:29:52.927 "reset": true, 00:29:52.927 "nvme_admin": false, 00:29:52.927 "nvme_io": false, 00:29:52.927 "nvme_io_md": false, 00:29:52.927 
"write_zeroes": true, 00:29:52.927 "zcopy": false, 00:29:52.927 "get_zone_info": false, 00:29:52.927 "zone_management": false, 00:29:52.927 "zone_append": false, 00:29:52.927 "compare": false, 00:29:52.927 "compare_and_write": false, 00:29:52.927 "abort": false, 00:29:52.927 "seek_hole": true, 00:29:52.927 "seek_data": true, 00:29:52.927 "copy": false, 00:29:52.927 "nvme_iov_md": false 00:29:52.927 }, 00:29:52.927 "driver_specific": { 00:29:52.927 "lvol": { 00:29:52.927 "lvol_store_uuid": "aee596d4-b3ca-41cb-8109-cb9fbd0dd9f1", 00:29:52.927 "base_bdev": "Nvme0n1", 00:29:52.927 "thin_provision": true, 00:29:52.927 "num_allocated_clusters": 0, 00:29:52.927 "snapshot": false, 00:29:52.927 "clone": false, 00:29:52.927 "esnap_clone": false 00:29:52.927 } 00:29:52.927 } 00:29:52.927 } 00:29:52.927 ] 00:29:52.927 13:48:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:52.927 13:48:32 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:52.927 13:48:32 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:52.927 [2024-07-15 13:48:32.345550] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:52.927 COMP_lvs0/lv0 00:29:53.184 13:48:32 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:53.184 13:48:32 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:53.450 13:48:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:53.450 [ 00:29:53.450 { 00:29:53.450 "name": "COMP_lvs0/lv0", 00:29:53.450 "aliases": [ 00:29:53.450 "e2caae6f-f83e-5407-872f-a531f737b93c" 00:29:53.450 ], 00:29:53.450 "product_name": "compress", 00:29:53.450 "block_size": 512, 00:29:53.450 "num_blocks": 200704, 00:29:53.450 "uuid": "e2caae6f-f83e-5407-872f-a531f737b93c", 00:29:53.450 "assigned_rate_limits": { 00:29:53.451 "rw_ios_per_sec": 0, 00:29:53.451 "rw_mbytes_per_sec": 0, 00:29:53.451 "r_mbytes_per_sec": 0, 00:29:53.451 "w_mbytes_per_sec": 0 00:29:53.451 }, 00:29:53.451 "claimed": false, 00:29:53.451 "zoned": false, 00:29:53.451 "supported_io_types": { 00:29:53.451 "read": true, 00:29:53.451 "write": true, 00:29:53.451 "unmap": false, 00:29:53.451 "flush": false, 00:29:53.451 "reset": false, 00:29:53.451 "nvme_admin": false, 00:29:53.451 "nvme_io": false, 00:29:53.451 "nvme_io_md": false, 00:29:53.451 "write_zeroes": true, 00:29:53.451 "zcopy": false, 00:29:53.451 "get_zone_info": false, 00:29:53.451 "zone_management": false, 00:29:53.451 "zone_append": false, 00:29:53.451 "compare": false, 00:29:53.451 "compare_and_write": false, 00:29:53.451 "abort": false, 00:29:53.451 "seek_hole": false, 00:29:53.451 "seek_data": false, 00:29:53.451 "copy": false, 00:29:53.451 "nvme_iov_md": false 00:29:53.451 }, 00:29:53.451 "driver_specific": { 00:29:53.451 "compress": { 00:29:53.451 "name": "COMP_lvs0/lv0", 00:29:53.451 "base_bdev_name": "279dcada-1b4d-4f73-917d-ffd161fd50b8" 00:29:53.451 } 00:29:53.451 } 00:29:53.451 } 00:29:53.451 ] 00:29:53.451 13:48:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:53.451 13:48:32 compress_compdev -- compress/compress.sh@59 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:53.708 [2024-07-15 13:48:32.966423] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f4c581b1350 PMD being used: compress_qat 00:29:53.708 I/O targets: 00:29:53.708 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:29:53.708 00:29:53.708 00:29:53.708 CUnit - A unit testing framework for C - Version 2.1-3 00:29:53.708 http://cunit.sourceforge.net/ 00:29:53.708 00:29:53.708 00:29:53.708 Suite: bdevio tests on: COMP_lvs0/lv0 00:29:53.708 Test: blockdev write read block ...passed 00:29:53.708 Test: blockdev write zeroes read block ...passed 00:29:53.708 Test: blockdev write zeroes read no split ...passed 00:29:53.708 Test: blockdev write zeroes read split ...passed 00:29:53.708 Test: blockdev write zeroes read split partial ...passed 00:29:53.708 Test: blockdev reset ...[2024-07-15 13:48:33.004012] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:29:53.708 passed 00:29:53.708 Test: blockdev write read 8 blocks ...passed 00:29:53.708 Test: blockdev write read size > 128k ...passed 00:29:53.708 Test: blockdev write read invalid size ...passed 00:29:53.708 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:53.708 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:53.708 Test: blockdev write read max offset ...passed 00:29:53.708 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:53.708 Test: blockdev writev readv 8 blocks ...passed 00:29:53.708 Test: blockdev writev readv 30 x 1block ...passed 00:29:53.708 Test: blockdev writev readv block ...passed 00:29:53.708 Test: blockdev writev readv size > 128k ...passed 00:29:53.708 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:53.708 Test: blockdev comparev and writev ...passed 00:29:53.708 Test: blockdev nvme passthru rw ...passed 00:29:53.708 Test: blockdev nvme passthru vendor 
specific ...passed 00:29:53.708 Test: blockdev nvme admin passthru ...passed 00:29:53.708 Test: blockdev copy ...passed 00:29:53.708 00:29:53.708 Run Summary: Type Total Ran Passed Failed Inactive 00:29:53.708 suites 1 1 n/a 0 0 00:29:53.708 tests 23 23 23 0 0 00:29:53.708 asserts 130 130 130 0 n/a 00:29:53.708 00:29:53.708 Elapsed time = 0.092 seconds 00:29:53.708 0 00:29:53.708 13:48:33 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:29:53.708 13:48:33 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:53.965 13:48:33 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:54.222 13:48:33 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:29:54.222 13:48:33 compress_compdev -- compress/compress.sh@62 -- # killprocess 2240418 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2240418 ']' 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2240418 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2240418 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2240418' 00:29:54.222 killing process with pid 2240418 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@967 -- # kill 2240418 00:29:54.222 13:48:33 compress_compdev -- common/autotest_common.sh@972 -- # wait 
2240418 00:29:57.500 13:48:36 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:29:57.500 13:48:36 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:29:57.500 00:29:57.500 real 0m48.514s 00:29:57.500 user 1m52.406s 00:29:57.500 sys 0m5.832s 00:29:57.500 13:48:36 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:57.500 13:48:36 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:57.500 ************************************ 00:29:57.500 END TEST compress_compdev 00:29:57.500 ************************************ 00:29:57.500 13:48:36 -- common/autotest_common.sh@1142 -- # return 0 00:29:57.500 13:48:36 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:57.500 13:48:36 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:57.500 13:48:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:57.500 13:48:36 -- common/autotest_common.sh@10 -- # set +x 00:29:57.500 ************************************ 00:29:57.500 START TEST compress_isal 00:29:57.500 ************************************ 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:57.500 * Looking for test storage... 
00:29:57.500 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:57.500 13:48:36 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:57.500 13:48:36 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:57.500 13:48:36 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:57.500 13:48:36 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.500 13:48:36 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.500 13:48:36 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.500 13:48:36 compress_isal -- paths/export.sh@5 -- # export PATH 00:29:57.500 13:48:36 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@47 -- # : 0 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:57.500 13:48:36 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2241882 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:57.500 13:48:36 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 2241882 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2241882 ']' 00:29:57.500 13:48:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:57.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:57.500 13:48:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:57.500 [2024-07-15 13:48:36.869095] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:29:57.500 [2024-07-15 13:48:36.869174] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2241882 ] 00:29:57.758 [2024-07-15 13:48:36.991548] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:57.758 [2024-07-15 13:48:37.090414] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:57.758 [2024-07-15 13:48:37.090418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:58.691 13:48:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:58.691 13:48:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:29:58.691 13:48:37 compress_isal -- compress/compress.sh@74 -- # create_vols 00:29:58.691 13:48:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:58.691 13:48:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:59.256 13:48:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:59.256 13:48:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:59.512 [ 00:29:59.512 { 00:29:59.512 "name": "Nvme0n1", 00:29:59.512 "aliases": [ 00:29:59.512 "01000000-0000-0000-5cd2-e43197705251" 00:29:59.512 ], 00:29:59.512 "product_name": "NVMe disk", 00:29:59.512 "block_size": 512, 00:29:59.512 "num_blocks": 15002931888, 00:29:59.512 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:59.512 "assigned_rate_limits": { 00:29:59.512 "rw_ios_per_sec": 0, 00:29:59.512 "rw_mbytes_per_sec": 0, 00:29:59.512 "r_mbytes_per_sec": 0, 00:29:59.512 "w_mbytes_per_sec": 0 00:29:59.512 }, 00:29:59.512 "claimed": false, 00:29:59.512 "zoned": false, 00:29:59.512 "supported_io_types": { 00:29:59.512 "read": true, 00:29:59.512 "write": true, 00:29:59.512 "unmap": true, 00:29:59.512 "flush": true, 00:29:59.512 "reset": true, 00:29:59.512 "nvme_admin": true, 00:29:59.512 "nvme_io": true, 00:29:59.512 "nvme_io_md": false, 00:29:59.512 "write_zeroes": true, 00:29:59.512 "zcopy": false, 00:29:59.512 "get_zone_info": false, 00:29:59.512 "zone_management": false, 00:29:59.512 "zone_append": false, 00:29:59.512 "compare": false, 00:29:59.512 "compare_and_write": false, 00:29:59.512 "abort": true, 00:29:59.512 "seek_hole": false, 00:29:59.512 "seek_data": false, 00:29:59.512 "copy": false, 00:29:59.512 "nvme_iov_md": false 00:29:59.512 }, 00:29:59.512 "driver_specific": { 00:29:59.512 "nvme": [ 00:29:59.512 { 00:29:59.512 "pci_address": "0000:5e:00.0", 00:29:59.512 "trid": { 00:29:59.512 "trtype": "PCIe", 00:29:59.512 "traddr": "0000:5e:00.0" 00:29:59.512 }, 00:29:59.512 "ctrlr_data": { 00:29:59.512 "cntlid": 0, 00:29:59.512 "vendor_id": "0x8086", 00:29:59.512 "model_number": "INTEL SSDPF2KX076TZO", 00:29:59.512 "serial_number": "PHAC0301002G7P6CGN", 00:29:59.512 "firmware_revision": "JCV10200", 00:29:59.512 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:59.512 "oacs": { 00:29:59.512 "security": 1, 00:29:59.512 "format": 1, 00:29:59.512 "firmware": 1, 00:29:59.513 "ns_manage": 1 00:29:59.513 }, 
00:29:59.513 "multi_ctrlr": false, 00:29:59.513 "ana_reporting": false 00:29:59.513 }, 00:29:59.513 "vs": { 00:29:59.513 "nvme_version": "1.3" 00:29:59.513 }, 00:29:59.513 "ns_data": { 00:29:59.513 "id": 1, 00:29:59.513 "can_share": false 00:29:59.513 }, 00:29:59.513 "security": { 00:29:59.513 "opal": true 00:29:59.513 } 00:29:59.513 } 00:29:59.513 ], 00:29:59.513 "mp_policy": "active_passive" 00:29:59.513 } 00:29:59.513 } 00:29:59.513 ] 00:29:59.513 13:48:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:29:59.513 13:48:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:02.038 48c9b5b7-bd64-4a7e-8b60-b1b055da9150 00:30:02.038 13:48:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:02.295 6e99e2c2-fdda-4367-bb9b-8c4030ecfd03 00:30:02.295 13:48:41 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:02.295 13:48:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:02.554 13:48:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:02.811 [ 00:30:02.811 { 00:30:02.811 "name": "6e99e2c2-fdda-4367-bb9b-8c4030ecfd03", 00:30:02.811 "aliases": [ 00:30:02.811 "lvs0/lv0" 
00:30:02.811 ], 00:30:02.811 "product_name": "Logical Volume", 00:30:02.811 "block_size": 512, 00:30:02.811 "num_blocks": 204800, 00:30:02.811 "uuid": "6e99e2c2-fdda-4367-bb9b-8c4030ecfd03", 00:30:02.811 "assigned_rate_limits": { 00:30:02.811 "rw_ios_per_sec": 0, 00:30:02.811 "rw_mbytes_per_sec": 0, 00:30:02.811 "r_mbytes_per_sec": 0, 00:30:02.811 "w_mbytes_per_sec": 0 00:30:02.811 }, 00:30:02.811 "claimed": false, 00:30:02.811 "zoned": false, 00:30:02.811 "supported_io_types": { 00:30:02.811 "read": true, 00:30:02.812 "write": true, 00:30:02.812 "unmap": true, 00:30:02.812 "flush": false, 00:30:02.812 "reset": true, 00:30:02.812 "nvme_admin": false, 00:30:02.812 "nvme_io": false, 00:30:02.812 "nvme_io_md": false, 00:30:02.812 "write_zeroes": true, 00:30:02.812 "zcopy": false, 00:30:02.812 "get_zone_info": false, 00:30:02.812 "zone_management": false, 00:30:02.812 "zone_append": false, 00:30:02.812 "compare": false, 00:30:02.812 "compare_and_write": false, 00:30:02.812 "abort": false, 00:30:02.812 "seek_hole": true, 00:30:02.812 "seek_data": true, 00:30:02.812 "copy": false, 00:30:02.812 "nvme_iov_md": false 00:30:02.812 }, 00:30:02.812 "driver_specific": { 00:30:02.812 "lvol": { 00:30:02.812 "lvol_store_uuid": "48c9b5b7-bd64-4a7e-8b60-b1b055da9150", 00:30:02.812 "base_bdev": "Nvme0n1", 00:30:02.812 "thin_provision": true, 00:30:02.812 "num_allocated_clusters": 0, 00:30:02.812 "snapshot": false, 00:30:02.812 "clone": false, 00:30:02.812 "esnap_clone": false 00:30:02.812 } 00:30:02.812 } 00:30:02.812 } 00:30:02.812 ] 00:30:02.812 13:48:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:02.812 13:48:42 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:02.812 13:48:42 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:03.069 [2024-07-15 13:48:42.295099] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:30:03.069 COMP_lvs0/lv0 00:30:03.069 13:48:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:03.069 13:48:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:03.327 13:48:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:03.584 [ 00:30:03.584 { 00:30:03.584 "name": "COMP_lvs0/lv0", 00:30:03.584 "aliases": [ 00:30:03.584 "baf7d3e3-0b2d-5d26-80b3-0ad5a9c416d0" 00:30:03.584 ], 00:30:03.584 "product_name": "compress", 00:30:03.584 "block_size": 512, 00:30:03.584 "num_blocks": 200704, 00:30:03.584 "uuid": "baf7d3e3-0b2d-5d26-80b3-0ad5a9c416d0", 00:30:03.584 "assigned_rate_limits": { 00:30:03.584 "rw_ios_per_sec": 0, 00:30:03.584 "rw_mbytes_per_sec": 0, 00:30:03.584 "r_mbytes_per_sec": 0, 00:30:03.584 "w_mbytes_per_sec": 0 00:30:03.584 }, 00:30:03.584 "claimed": false, 00:30:03.584 "zoned": false, 00:30:03.584 "supported_io_types": { 00:30:03.584 "read": true, 00:30:03.584 "write": true, 00:30:03.584 "unmap": false, 00:30:03.584 "flush": false, 00:30:03.584 "reset": false, 00:30:03.584 "nvme_admin": false, 00:30:03.584 "nvme_io": false, 00:30:03.584 "nvme_io_md": false, 00:30:03.584 "write_zeroes": true, 00:30:03.584 "zcopy": false, 00:30:03.584 "get_zone_info": false, 00:30:03.584 "zone_management": false, 00:30:03.584 "zone_append": 
false, 00:30:03.584 "compare": false, 00:30:03.584 "compare_and_write": false, 00:30:03.584 "abort": false, 00:30:03.584 "seek_hole": false, 00:30:03.584 "seek_data": false, 00:30:03.584 "copy": false, 00:30:03.584 "nvme_iov_md": false 00:30:03.584 }, 00:30:03.584 "driver_specific": { 00:30:03.584 "compress": { 00:30:03.584 "name": "COMP_lvs0/lv0", 00:30:03.584 "base_bdev_name": "6e99e2c2-fdda-4367-bb9b-8c4030ecfd03" 00:30:03.584 } 00:30:03.584 } 00:30:03.584 } 00:30:03.584 ] 00:30:03.584 13:48:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:03.584 13:48:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:03.584 Running I/O for 3 seconds... 00:30:06.861 00:30:06.862 Latency(us) 00:30:06.862 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:06.862 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:06.862 Verification LBA range: start 0x0 length 0x3100 00:30:06.862 COMP_lvs0/lv0 : 3.00 3923.24 15.33 0.00 0.00 8100.14 744.40 7094.98 00:30:06.862 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:06.862 Verification LBA range: start 0x3100 length 0x3100 00:30:06.862 COMP_lvs0/lv0 : 3.00 3925.85 15.34 0.00 0.00 8107.89 537.82 7038.00 00:30:06.862 =================================================================================================================== 00:30:06.862 Total : 7849.10 30.66 0.00 0.00 8104.02 537.82 7094.98 00:30:06.862 0 00:30:06.862 13:48:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:06.862 13:48:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:06.862 13:48:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:07.120 
13:48:46 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:07.120 13:48:46 compress_isal -- compress/compress.sh@78 -- # killprocess 2241882 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2241882 ']' 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2241882 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2241882 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2241882' 00:30:07.120 killing process with pid 2241882 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@967 -- # kill 2241882 00:30:07.120 Received shutdown signal, test time was about 3.000000 seconds 00:30:07.120 00:30:07.120 Latency(us) 00:30:07.120 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:07.120 =================================================================================================================== 00:30:07.120 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:07.120 13:48:46 compress_isal -- common/autotest_common.sh@972 -- # wait 2241882 00:30:10.400 13:48:49 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:10.400 13:48:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:10.400 13:48:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2243483 00:30:10.400 13:48:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:10.400 13:48:49 
compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:10.400 13:48:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2243483 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2243483 ']' 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:10.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:10.400 13:48:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:10.400 [2024-07-15 13:48:49.589096] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:30:10.400 [2024-07-15 13:48:49.589176] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2243483 ] 00:30:10.400 [2024-07-15 13:48:49.712557] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:10.400 [2024-07-15 13:48:49.812023] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:10.400 [2024-07-15 13:48:49.812030] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:11.362 13:48:50 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:11.362 13:48:50 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:11.362 13:48:50 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:11.362 13:48:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:11.362 13:48:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:11.942 13:48:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:11.942 13:48:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:12.199 [ 00:30:12.199 { 00:30:12.199 "name": "Nvme0n1", 00:30:12.199 "aliases": [ 00:30:12.199 "01000000-0000-0000-5cd2-e43197705251" 00:30:12.199 ], 00:30:12.199 "product_name": "NVMe disk", 00:30:12.199 "block_size": 512, 00:30:12.199 "num_blocks": 15002931888, 00:30:12.199 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:12.199 "assigned_rate_limits": { 00:30:12.199 "rw_ios_per_sec": 0, 00:30:12.199 "rw_mbytes_per_sec": 0, 00:30:12.199 "r_mbytes_per_sec": 0, 00:30:12.199 "w_mbytes_per_sec": 0 00:30:12.199 }, 00:30:12.199 "claimed": false, 00:30:12.199 "zoned": false, 00:30:12.199 "supported_io_types": { 00:30:12.199 "read": true, 00:30:12.199 "write": true, 00:30:12.199 "unmap": true, 00:30:12.199 "flush": true, 00:30:12.199 "reset": true, 00:30:12.199 "nvme_admin": true, 00:30:12.199 "nvme_io": true, 00:30:12.199 "nvme_io_md": false, 00:30:12.199 "write_zeroes": true, 00:30:12.199 "zcopy": false, 00:30:12.199 "get_zone_info": false, 00:30:12.199 "zone_management": false, 00:30:12.199 "zone_append": false, 00:30:12.199 "compare": false, 00:30:12.199 "compare_and_write": false, 00:30:12.199 "abort": true, 00:30:12.199 "seek_hole": false, 00:30:12.199 "seek_data": false, 00:30:12.199 "copy": false, 00:30:12.199 "nvme_iov_md": false 00:30:12.199 }, 00:30:12.199 "driver_specific": { 00:30:12.199 "nvme": [ 00:30:12.199 { 00:30:12.199 "pci_address": "0000:5e:00.0", 00:30:12.199 "trid": { 00:30:12.199 "trtype": "PCIe", 00:30:12.199 "traddr": "0000:5e:00.0" 00:30:12.199 }, 00:30:12.199 "ctrlr_data": { 00:30:12.199 "cntlid": 0, 00:30:12.199 "vendor_id": "0x8086", 00:30:12.199 "model_number": "INTEL SSDPF2KX076TZO", 00:30:12.199 "serial_number": "PHAC0301002G7P6CGN", 00:30:12.199 "firmware_revision": "JCV10200", 00:30:12.199 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:12.199 "oacs": { 00:30:12.199 "security": 1, 00:30:12.199 "format": 1, 00:30:12.199 "firmware": 1, 00:30:12.199 "ns_manage": 1 00:30:12.199 }, 
00:30:12.199 "multi_ctrlr": false, 00:30:12.199 "ana_reporting": false 00:30:12.199 }, 00:30:12.199 "vs": { 00:30:12.199 "nvme_version": "1.3" 00:30:12.199 }, 00:30:12.199 "ns_data": { 00:30:12.199 "id": 1, 00:30:12.199 "can_share": false 00:30:12.199 }, 00:30:12.199 "security": { 00:30:12.199 "opal": true 00:30:12.199 } 00:30:12.199 } 00:30:12.199 ], 00:30:12.199 "mp_policy": "active_passive" 00:30:12.199 } 00:30:12.199 } 00:30:12.199 ] 00:30:12.199 13:48:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:12.199 13:48:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:14.721 b92d7238-7a52-4809-9a8a-ca8e97b27230 00:30:14.721 13:48:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:14.977 167c2003-a21f-4397-bfbe-f56ef7005f5c 00:30:14.977 13:48:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:14.977 13:48:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:15.234 13:48:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:15.234 [ 00:30:15.234 { 00:30:15.234 "name": "167c2003-a21f-4397-bfbe-f56ef7005f5c", 00:30:15.234 "aliases": [ 00:30:15.234 "lvs0/lv0" 
00:30:15.234 ], 00:30:15.234 "product_name": "Logical Volume", 00:30:15.234 "block_size": 512, 00:30:15.234 "num_blocks": 204800, 00:30:15.234 "uuid": "167c2003-a21f-4397-bfbe-f56ef7005f5c", 00:30:15.234 "assigned_rate_limits": { 00:30:15.234 "rw_ios_per_sec": 0, 00:30:15.234 "rw_mbytes_per_sec": 0, 00:30:15.234 "r_mbytes_per_sec": 0, 00:30:15.234 "w_mbytes_per_sec": 0 00:30:15.234 }, 00:30:15.234 "claimed": false, 00:30:15.234 "zoned": false, 00:30:15.234 "supported_io_types": { 00:30:15.234 "read": true, 00:30:15.234 "write": true, 00:30:15.234 "unmap": true, 00:30:15.234 "flush": false, 00:30:15.234 "reset": true, 00:30:15.234 "nvme_admin": false, 00:30:15.234 "nvme_io": false, 00:30:15.234 "nvme_io_md": false, 00:30:15.234 "write_zeroes": true, 00:30:15.234 "zcopy": false, 00:30:15.234 "get_zone_info": false, 00:30:15.234 "zone_management": false, 00:30:15.234 "zone_append": false, 00:30:15.234 "compare": false, 00:30:15.234 "compare_and_write": false, 00:30:15.234 "abort": false, 00:30:15.234 "seek_hole": true, 00:30:15.234 "seek_data": true, 00:30:15.234 "copy": false, 00:30:15.234 "nvme_iov_md": false 00:30:15.234 }, 00:30:15.234 "driver_specific": { 00:30:15.234 "lvol": { 00:30:15.234 "lvol_store_uuid": "b92d7238-7a52-4809-9a8a-ca8e97b27230", 00:30:15.234 "base_bdev": "Nvme0n1", 00:30:15.234 "thin_provision": true, 00:30:15.234 "num_allocated_clusters": 0, 00:30:15.234 "snapshot": false, 00:30:15.234 "clone": false, 00:30:15.234 "esnap_clone": false 00:30:15.234 } 00:30:15.234 } 00:30:15.234 } 00:30:15.234 ] 00:30:15.491 13:48:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:15.491 13:48:54 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:15.491 13:48:54 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:15.491 [2024-07-15 13:48:54.906728] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:30:15.491 COMP_lvs0/lv0 00:30:15.748 13:48:54 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:15.748 13:48:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:15.748 13:48:55 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:16.005 [ 00:30:16.005 { 00:30:16.005 "name": "COMP_lvs0/lv0", 00:30:16.005 "aliases": [ 00:30:16.005 "6d91d79e-ef5c-5c8e-a907-4d83f72c35bf" 00:30:16.005 ], 00:30:16.005 "product_name": "compress", 00:30:16.005 "block_size": 512, 00:30:16.005 "num_blocks": 200704, 00:30:16.005 "uuid": "6d91d79e-ef5c-5c8e-a907-4d83f72c35bf", 00:30:16.005 "assigned_rate_limits": { 00:30:16.005 "rw_ios_per_sec": 0, 00:30:16.005 "rw_mbytes_per_sec": 0, 00:30:16.005 "r_mbytes_per_sec": 0, 00:30:16.005 "w_mbytes_per_sec": 0 00:30:16.005 }, 00:30:16.005 "claimed": false, 00:30:16.005 "zoned": false, 00:30:16.005 "supported_io_types": { 00:30:16.005 "read": true, 00:30:16.005 "write": true, 00:30:16.005 "unmap": false, 00:30:16.005 "flush": false, 00:30:16.005 "reset": false, 00:30:16.005 "nvme_admin": false, 00:30:16.005 "nvme_io": false, 00:30:16.005 "nvme_io_md": false, 00:30:16.005 "write_zeroes": true, 00:30:16.005 "zcopy": false, 00:30:16.005 "get_zone_info": false, 00:30:16.005 "zone_management": false, 00:30:16.005 "zone_append": 
false, 00:30:16.005 "compare": false, 00:30:16.005 "compare_and_write": false, 00:30:16.005 "abort": false, 00:30:16.005 "seek_hole": false, 00:30:16.005 "seek_data": false, 00:30:16.005 "copy": false, 00:30:16.005 "nvme_iov_md": false 00:30:16.005 }, 00:30:16.005 "driver_specific": { 00:30:16.005 "compress": { 00:30:16.005 "name": "COMP_lvs0/lv0", 00:30:16.005 "base_bdev_name": "167c2003-a21f-4397-bfbe-f56ef7005f5c" 00:30:16.005 } 00:30:16.005 } 00:30:16.005 } 00:30:16.005 ] 00:30:16.005 13:48:55 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:16.005 13:48:55 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:16.263 Running I/O for 3 seconds... 00:30:19.542 00:30:19.542 Latency(us) 00:30:19.542 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:19.542 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:19.542 Verification LBA range: start 0x0 length 0x3100 00:30:19.542 COMP_lvs0/lv0 : 3.01 2883.24 11.26 0.00 0.00 11051.00 666.05 9061.06 00:30:19.542 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:19.542 Verification LBA range: start 0x3100 length 0x3100 00:30:19.542 COMP_lvs0/lv0 : 3.01 2879.92 11.25 0.00 0.00 11071.62 961.67 9516.97 00:30:19.542 =================================================================================================================== 00:30:19.542 Total : 5763.16 22.51 0.00 0.00 11061.30 666.05 9516.97 00:30:19.542 0 00:30:19.542 13:48:58 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:19.542 13:48:58 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:19.542 13:48:58 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:30:19.799 13:48:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:19.799 13:48:59 compress_isal -- compress/compress.sh@78 -- # killprocess 2243483 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2243483 ']' 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2243483 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2243483 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2243483' 00:30:19.799 killing process with pid 2243483 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@967 -- # kill 2243483 00:30:19.799 Received shutdown signal, test time was about 3.000000 seconds 00:30:19.799 00:30:19.799 Latency(us) 00:30:19.799 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:19.799 =================================================================================================================== 00:30:19.799 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:19.799 13:48:59 compress_isal -- common/autotest_common.sh@972 -- # wait 2243483 00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2245082 00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:23.079 13:49:01 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2245082 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2245082 ']' 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:23.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:23.079 13:49:01 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:23.079 [2024-07-15 13:49:01.939644] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:30:23.079 [2024-07-15 13:49:01.939718] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2245082 ] 00:30:23.079 [2024-07-15 13:49:02.062145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:23.079 [2024-07-15 13:49:02.171945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:23.079 [2024-07-15 13:49:02.171995] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:23.644 13:49:02 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:23.644 13:49:02 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:23.644 13:49:02 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:23.644 13:49:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:23.644 13:49:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:24.216 13:49:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:24.216 13:49:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:24.475 13:49:03 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:24.475 [ 00:30:24.475 { 00:30:24.475 "name": "Nvme0n1", 00:30:24.475 "aliases": [ 00:30:24.475 "01000000-0000-0000-5cd2-e43197705251" 00:30:24.475 ], 00:30:24.475 "product_name": "NVMe disk", 00:30:24.475 "block_size": 512, 00:30:24.475 "num_blocks": 15002931888, 00:30:24.475 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:24.475 "assigned_rate_limits": { 00:30:24.475 "rw_ios_per_sec": 0, 00:30:24.475 "rw_mbytes_per_sec": 0, 00:30:24.475 "r_mbytes_per_sec": 0, 00:30:24.475 "w_mbytes_per_sec": 0 00:30:24.475 }, 00:30:24.475 "claimed": false, 00:30:24.475 "zoned": false, 00:30:24.475 "supported_io_types": { 00:30:24.475 "read": true, 00:30:24.475 "write": true, 00:30:24.475 "unmap": true, 00:30:24.475 "flush": true, 00:30:24.475 "reset": true, 00:30:24.475 "nvme_admin": true, 00:30:24.475 "nvme_io": true, 00:30:24.475 "nvme_io_md": false, 00:30:24.475 "write_zeroes": true, 00:30:24.475 "zcopy": false, 00:30:24.475 "get_zone_info": false, 00:30:24.475 "zone_management": false, 00:30:24.475 "zone_append": false, 00:30:24.475 "compare": false, 00:30:24.475 "compare_and_write": false, 00:30:24.475 "abort": true, 00:30:24.475 "seek_hole": false, 00:30:24.475 "seek_data": false, 00:30:24.475 "copy": false, 00:30:24.475 "nvme_iov_md": false 00:30:24.475 }, 00:30:24.475 "driver_specific": { 00:30:24.475 "nvme": [ 00:30:24.475 { 00:30:24.475 "pci_address": "0000:5e:00.0", 00:30:24.475 "trid": { 00:30:24.475 "trtype": "PCIe", 00:30:24.475 "traddr": "0000:5e:00.0" 00:30:24.475 }, 00:30:24.475 "ctrlr_data": { 00:30:24.475 "cntlid": 0, 00:30:24.475 "vendor_id": "0x8086", 00:30:24.475 "model_number": "INTEL SSDPF2KX076TZO", 00:30:24.475 "serial_number": "PHAC0301002G7P6CGN", 00:30:24.475 "firmware_revision": "JCV10200", 00:30:24.475 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:24.475 "oacs": { 00:30:24.475 "security": 1, 00:30:24.475 "format": 1, 00:30:24.475 "firmware": 1, 00:30:24.475 "ns_manage": 1 00:30:24.475 }, 
00:30:24.475 "multi_ctrlr": false, 00:30:24.475 "ana_reporting": false 00:30:24.475 }, 00:30:24.475 "vs": { 00:30:24.475 "nvme_version": "1.3" 00:30:24.475 }, 00:30:24.475 "ns_data": { 00:30:24.475 "id": 1, 00:30:24.475 "can_share": false 00:30:24.475 }, 00:30:24.475 "security": { 00:30:24.475 "opal": true 00:30:24.475 } 00:30:24.475 } 00:30:24.475 ], 00:30:24.475 "mp_policy": "active_passive" 00:30:24.475 } 00:30:24.475 } 00:30:24.475 ] 00:30:24.733 13:49:03 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:24.733 13:49:03 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:27.258 c7869633-9c30-4b16-a930-db818189282b 00:30:27.258 13:49:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:27.258 c1c40795-0917-4bd9-ac4a-e944f3ee8f1b 00:30:27.258 13:49:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:27.258 13:49:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:27.514 13:49:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:27.770 [ 00:30:27.770 { 00:30:27.770 "name": "c1c40795-0917-4bd9-ac4a-e944f3ee8f1b", 00:30:27.770 "aliases": [ 00:30:27.770 "lvs0/lv0" 
00:30:27.770 ], 00:30:27.770 "product_name": "Logical Volume", 00:30:27.770 "block_size": 512, 00:30:27.770 "num_blocks": 204800, 00:30:27.770 "uuid": "c1c40795-0917-4bd9-ac4a-e944f3ee8f1b", 00:30:27.770 "assigned_rate_limits": { 00:30:27.770 "rw_ios_per_sec": 0, 00:30:27.770 "rw_mbytes_per_sec": 0, 00:30:27.770 "r_mbytes_per_sec": 0, 00:30:27.770 "w_mbytes_per_sec": 0 00:30:27.770 }, 00:30:27.770 "claimed": false, 00:30:27.770 "zoned": false, 00:30:27.770 "supported_io_types": { 00:30:27.770 "read": true, 00:30:27.770 "write": true, 00:30:27.770 "unmap": true, 00:30:27.770 "flush": false, 00:30:27.770 "reset": true, 00:30:27.770 "nvme_admin": false, 00:30:27.770 "nvme_io": false, 00:30:27.770 "nvme_io_md": false, 00:30:27.770 "write_zeroes": true, 00:30:27.770 "zcopy": false, 00:30:27.770 "get_zone_info": false, 00:30:27.770 "zone_management": false, 00:30:27.770 "zone_append": false, 00:30:27.770 "compare": false, 00:30:27.770 "compare_and_write": false, 00:30:27.770 "abort": false, 00:30:27.770 "seek_hole": true, 00:30:27.770 "seek_data": true, 00:30:27.770 "copy": false, 00:30:27.770 "nvme_iov_md": false 00:30:27.770 }, 00:30:27.770 "driver_specific": { 00:30:27.770 "lvol": { 00:30:27.770 "lvol_store_uuid": "c7869633-9c30-4b16-a930-db818189282b", 00:30:27.770 "base_bdev": "Nvme0n1", 00:30:27.770 "thin_provision": true, 00:30:27.770 "num_allocated_clusters": 0, 00:30:27.770 "snapshot": false, 00:30:27.770 "clone": false, 00:30:27.770 "esnap_clone": false 00:30:27.770 } 00:30:27.770 } 00:30:27.770 } 00:30:27.770 ] 00:30:27.770 13:49:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:27.770 13:49:07 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:27.770 13:49:07 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:28.027 [2024-07-15 13:49:07.270834] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:28.027 COMP_lvs0/lv0 00:30:28.027 13:49:07 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:28.027 13:49:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:28.284 13:49:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:28.541 [ 00:30:28.541 { 00:30:28.541 "name": "COMP_lvs0/lv0", 00:30:28.541 "aliases": [ 00:30:28.541 "4a860573-de99-5aab-9993-f75fdd44f214" 00:30:28.541 ], 00:30:28.541 "product_name": "compress", 00:30:28.541 "block_size": 4096, 00:30:28.541 "num_blocks": 25088, 00:30:28.541 "uuid": "4a860573-de99-5aab-9993-f75fdd44f214", 00:30:28.541 "assigned_rate_limits": { 00:30:28.541 "rw_ios_per_sec": 0, 00:30:28.541 "rw_mbytes_per_sec": 0, 00:30:28.541 "r_mbytes_per_sec": 0, 00:30:28.541 "w_mbytes_per_sec": 0 00:30:28.541 }, 00:30:28.541 "claimed": false, 00:30:28.541 "zoned": false, 00:30:28.541 "supported_io_types": { 00:30:28.541 "read": true, 00:30:28.541 "write": true, 00:30:28.541 "unmap": false, 00:30:28.541 "flush": false, 00:30:28.541 "reset": false, 00:30:28.541 "nvme_admin": false, 00:30:28.541 "nvme_io": false, 00:30:28.541 "nvme_io_md": false, 00:30:28.541 "write_zeroes": true, 00:30:28.541 "zcopy": false, 00:30:28.541 "get_zone_info": false, 00:30:28.541 "zone_management": false, 00:30:28.541 
"zone_append": false, 00:30:28.541 "compare": false, 00:30:28.541 "compare_and_write": false, 00:30:28.541 "abort": false, 00:30:28.541 "seek_hole": false, 00:30:28.541 "seek_data": false, 00:30:28.541 "copy": false, 00:30:28.541 "nvme_iov_md": false 00:30:28.541 }, 00:30:28.541 "driver_specific": { 00:30:28.541 "compress": { 00:30:28.541 "name": "COMP_lvs0/lv0", 00:30:28.541 "base_bdev_name": "c1c40795-0917-4bd9-ac4a-e944f3ee8f1b" 00:30:28.541 } 00:30:28.541 } 00:30:28.541 } 00:30:28.541 ] 00:30:28.542 13:49:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:28.542 13:49:07 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:28.542 Running I/O for 3 seconds... 00:30:31.860 00:30:31.860 Latency(us) 00:30:31.860 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.860 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:31.860 Verification LBA range: start 0x0 length 0x3100 00:30:31.860 COMP_lvs0/lv0 : 3.00 3919.88 15.31 0.00 0.00 8107.96 687.42 8263.23 00:30:31.860 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:31.860 Verification LBA range: start 0x3100 length 0x3100 00:30:31.860 COMP_lvs0/lv0 : 3.00 3922.06 15.32 0.00 0.00 8116.49 541.38 8377.21 00:30:31.860 =================================================================================================================== 00:30:31.860 Total : 7841.94 30.63 0.00 0.00 8112.23 541.38 8377.21 00:30:31.860 0 00:30:31.860 13:49:10 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:31.860 13:49:10 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:31.860 13:49:11 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l 
lvs0 00:30:32.117 13:49:11 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:32.117 13:49:11 compress_isal -- compress/compress.sh@78 -- # killprocess 2245082 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2245082 ']' 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2245082 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2245082 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2245082' 00:30:32.117 killing process with pid 2245082 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@967 -- # kill 2245082 00:30:32.117 Received shutdown signal, test time was about 3.000000 seconds 00:30:32.117 00:30:32.117 Latency(us) 00:30:32.117 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:32.117 =================================================================================================================== 00:30:32.117 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:32.117 13:49:11 compress_isal -- common/autotest_common.sh@972 -- # wait 2245082 00:30:35.395 13:49:14 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:30:35.395 13:49:14 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:35.395 13:49:14 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2246680 00:30:35.395 13:49:14 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:35.395 13:49:14 
compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:30:35.395 13:49:14 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2246680 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2246680 ']' 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:35.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:35.395 13:49:14 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:35.395 [2024-07-15 13:49:14.448990] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:30:35.395 [2024-07-15 13:49:14.449065] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2246680 ] 00:30:35.395 [2024-07-15 13:49:14.580483] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:35.395 [2024-07-15 13:49:14.687003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:35.395 [2024-07-15 13:49:14.687091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:35.395 [2024-07-15 13:49:14.687096] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:35.961 13:49:15 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:35.961 13:49:15 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:35.961 13:49:15 compress_isal -- compress/compress.sh@58 -- # create_vols 00:30:35.961 13:49:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:35.961 13:49:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:36.526 13:49:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:36.526 13:49:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:36.784 13:49:16 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:37.041 [ 00:30:37.041 { 00:30:37.041 "name": "Nvme0n1", 00:30:37.041 "aliases": [ 00:30:37.041 "01000000-0000-0000-5cd2-e43197705251" 00:30:37.041 ], 00:30:37.041 "product_name": "NVMe disk", 00:30:37.041 "block_size": 512, 00:30:37.041 "num_blocks": 15002931888, 00:30:37.041 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:37.041 "assigned_rate_limits": { 00:30:37.041 "rw_ios_per_sec": 0, 00:30:37.041 "rw_mbytes_per_sec": 0, 00:30:37.041 "r_mbytes_per_sec": 0, 00:30:37.041 "w_mbytes_per_sec": 0 00:30:37.041 }, 00:30:37.041 "claimed": false, 00:30:37.041 "zoned": false, 00:30:37.041 "supported_io_types": { 00:30:37.041 "read": true, 00:30:37.041 "write": true, 00:30:37.041 "unmap": true, 00:30:37.041 "flush": true, 00:30:37.041 "reset": true, 00:30:37.041 "nvme_admin": true, 00:30:37.041 "nvme_io": true, 00:30:37.041 "nvme_io_md": false, 00:30:37.041 "write_zeroes": true, 00:30:37.041 "zcopy": false, 00:30:37.041 "get_zone_info": false, 00:30:37.041 "zone_management": false, 00:30:37.041 "zone_append": false, 00:30:37.041 "compare": false, 00:30:37.041 "compare_and_write": false, 00:30:37.041 "abort": true, 00:30:37.041 "seek_hole": false, 00:30:37.041 "seek_data": false, 00:30:37.041 "copy": false, 00:30:37.041 "nvme_iov_md": false 00:30:37.041 }, 00:30:37.041 "driver_specific": { 00:30:37.041 "nvme": [ 00:30:37.041 { 00:30:37.041 "pci_address": "0000:5e:00.0", 00:30:37.041 "trid": { 00:30:37.041 "trtype": "PCIe", 00:30:37.041 "traddr": "0000:5e:00.0" 00:30:37.041 }, 00:30:37.041 "ctrlr_data": { 00:30:37.041 "cntlid": 0, 00:30:37.041 "vendor_id": "0x8086", 00:30:37.041 "model_number": "INTEL SSDPF2KX076TZO", 00:30:37.041 "serial_number": "PHAC0301002G7P6CGN", 00:30:37.041 "firmware_revision": "JCV10200", 00:30:37.041 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:37.041 "oacs": { 00:30:37.041 "security": 1, 
00:30:37.041 "format": 1, 00:30:37.041 "firmware": 1, 00:30:37.041 "ns_manage": 1 00:30:37.041 }, 00:30:37.042 "multi_ctrlr": false, 00:30:37.042 "ana_reporting": false 00:30:37.042 }, 00:30:37.042 "vs": { 00:30:37.042 "nvme_version": "1.3" 00:30:37.042 }, 00:30:37.042 "ns_data": { 00:30:37.042 "id": 1, 00:30:37.042 "can_share": false 00:30:37.042 }, 00:30:37.042 "security": { 00:30:37.042 "opal": true 00:30:37.042 } 00:30:37.042 } 00:30:37.042 ], 00:30:37.042 "mp_policy": "active_passive" 00:30:37.042 } 00:30:37.042 } 00:30:37.042 ] 00:30:37.042 13:49:16 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:37.042 13:49:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:39.568 8229ba80-217d-4469-a1c3-60d4361efd1f 00:30:39.568 13:49:18 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:39.826 0904ab8d-62f0-4d6b-8bf4-fe83d3fd8e06 00:30:39.826 13:49:19 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:39.826 13:49:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:40.083 13:49:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:40.341 [ 00:30:40.341 { 00:30:40.341 
"name": "0904ab8d-62f0-4d6b-8bf4-fe83d3fd8e06", 00:30:40.341 "aliases": [ 00:30:40.341 "lvs0/lv0" 00:30:40.341 ], 00:30:40.341 "product_name": "Logical Volume", 00:30:40.341 "block_size": 512, 00:30:40.341 "num_blocks": 204800, 00:30:40.341 "uuid": "0904ab8d-62f0-4d6b-8bf4-fe83d3fd8e06", 00:30:40.341 "assigned_rate_limits": { 00:30:40.341 "rw_ios_per_sec": 0, 00:30:40.341 "rw_mbytes_per_sec": 0, 00:30:40.341 "r_mbytes_per_sec": 0, 00:30:40.341 "w_mbytes_per_sec": 0 00:30:40.341 }, 00:30:40.341 "claimed": false, 00:30:40.341 "zoned": false, 00:30:40.341 "supported_io_types": { 00:30:40.341 "read": true, 00:30:40.341 "write": true, 00:30:40.341 "unmap": true, 00:30:40.341 "flush": false, 00:30:40.341 "reset": true, 00:30:40.341 "nvme_admin": false, 00:30:40.341 "nvme_io": false, 00:30:40.341 "nvme_io_md": false, 00:30:40.341 "write_zeroes": true, 00:30:40.341 "zcopy": false, 00:30:40.341 "get_zone_info": false, 00:30:40.341 "zone_management": false, 00:30:40.341 "zone_append": false, 00:30:40.341 "compare": false, 00:30:40.341 "compare_and_write": false, 00:30:40.341 "abort": false, 00:30:40.341 "seek_hole": true, 00:30:40.341 "seek_data": true, 00:30:40.341 "copy": false, 00:30:40.341 "nvme_iov_md": false 00:30:40.341 }, 00:30:40.341 "driver_specific": { 00:30:40.341 "lvol": { 00:30:40.341 "lvol_store_uuid": "8229ba80-217d-4469-a1c3-60d4361efd1f", 00:30:40.341 "base_bdev": "Nvme0n1", 00:30:40.341 "thin_provision": true, 00:30:40.341 "num_allocated_clusters": 0, 00:30:40.341 "snapshot": false, 00:30:40.341 "clone": false, 00:30:40.341 "esnap_clone": false 00:30:40.341 } 00:30:40.341 } 00:30:40.341 } 00:30:40.341 ] 00:30:40.341 13:49:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:40.341 13:49:19 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:40.341 13:49:19 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:40.598 
[2024-07-15 13:49:19.768433] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:40.598 COMP_lvs0/lv0 00:30:40.598 13:49:19 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:40.598 13:49:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:40.598 13:49:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:40.598 13:49:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:40.598 13:49:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:40.599 13:49:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:40.599 13:49:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:40.860 13:49:20 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:40.860 [ 00:30:40.860 { 00:30:40.860 "name": "COMP_lvs0/lv0", 00:30:40.860 "aliases": [ 00:30:40.860 "f126157b-42b9-56a9-9fa6-91ca681d0de0" 00:30:40.860 ], 00:30:40.860 "product_name": "compress", 00:30:40.860 "block_size": 512, 00:30:40.860 "num_blocks": 200704, 00:30:40.860 "uuid": "f126157b-42b9-56a9-9fa6-91ca681d0de0", 00:30:40.860 "assigned_rate_limits": { 00:30:40.860 "rw_ios_per_sec": 0, 00:30:40.860 "rw_mbytes_per_sec": 0, 00:30:40.860 "r_mbytes_per_sec": 0, 00:30:40.860 "w_mbytes_per_sec": 0 00:30:40.860 }, 00:30:40.860 "claimed": false, 00:30:40.860 "zoned": false, 00:30:40.860 "supported_io_types": { 00:30:40.860 "read": true, 00:30:40.860 "write": true, 00:30:40.860 "unmap": false, 00:30:40.860 "flush": false, 00:30:40.860 "reset": false, 00:30:40.860 "nvme_admin": false, 00:30:40.860 "nvme_io": false, 00:30:40.860 "nvme_io_md": false, 00:30:40.860 "write_zeroes": true, 00:30:40.860 "zcopy": false, 00:30:40.860 
"get_zone_info": false, 00:30:40.860 "zone_management": false, 00:30:40.860 "zone_append": false, 00:30:40.860 "compare": false, 00:30:40.860 "compare_and_write": false, 00:30:40.860 "abort": false, 00:30:40.860 "seek_hole": false, 00:30:40.860 "seek_data": false, 00:30:40.860 "copy": false, 00:30:40.860 "nvme_iov_md": false 00:30:40.860 }, 00:30:40.860 "driver_specific": { 00:30:40.860 "compress": { 00:30:40.860 "name": "COMP_lvs0/lv0", 00:30:40.860 "base_bdev_name": "0904ab8d-62f0-4d6b-8bf4-fe83d3fd8e06" 00:30:40.860 } 00:30:40.860 } 00:30:40.860 } 00:30:40.860 ] 00:30:41.119 13:49:20 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:41.119 13:49:20 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:41.119 I/O targets: 00:30:41.119 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:41.119 00:30:41.119 00:30:41.119 CUnit - A unit testing framework for C - Version 2.1-3 00:30:41.119 http://cunit.sourceforge.net/ 00:30:41.119 00:30:41.119 00:30:41.119 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:41.119 Test: blockdev write read block ...passed 00:30:41.119 Test: blockdev write zeroes read block ...passed 00:30:41.119 Test: blockdev write zeroes read no split ...passed 00:30:41.119 Test: blockdev write zeroes read split ...passed 00:30:41.119 Test: blockdev write zeroes read split partial ...passed 00:30:41.119 Test: blockdev reset ...[2024-07-15 13:49:20.447561] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:41.119 passed 00:30:41.119 Test: blockdev write read 8 blocks ...passed 00:30:41.119 Test: blockdev write read size > 128k ...passed 00:30:41.119 Test: blockdev write read invalid size ...passed 00:30:41.119 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:41.119 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:41.119 Test: blockdev write read max offset 
...passed 00:30:41.119 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:41.119 Test: blockdev writev readv 8 blocks ...passed 00:30:41.119 Test: blockdev writev readv 30 x 1block ...passed 00:30:41.119 Test: blockdev writev readv block ...passed 00:30:41.119 Test: blockdev writev readv size > 128k ...passed 00:30:41.119 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:41.119 Test: blockdev comparev and writev ...passed 00:30:41.119 Test: blockdev nvme passthru rw ...passed 00:30:41.119 Test: blockdev nvme passthru vendor specific ...passed 00:30:41.119 Test: blockdev nvme admin passthru ...passed 00:30:41.119 Test: blockdev copy ...passed 00:30:41.119 00:30:41.119 Run Summary: Type Total Ran Passed Failed Inactive 00:30:41.119 suites 1 1 n/a 0 0 00:30:41.119 tests 23 23 23 0 0 00:30:41.119 asserts 130 130 130 0 n/a 00:30:41.119 00:30:41.119 Elapsed time = 0.107 seconds 00:30:41.119 0 00:30:41.119 13:49:20 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:30:41.119 13:49:20 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:41.378 13:49:20 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:41.637 13:49:20 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:41.637 13:49:20 compress_isal -- compress/compress.sh@62 -- # killprocess 2246680 00:30:41.637 13:49:20 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2246680 ']' 00:30:41.637 13:49:20 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2246680 00:30:41.637 13:49:20 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:41.637 13:49:20 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:41.637 13:49:20 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
2246680 00:30:41.637 13:49:21 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:41.637 13:49:21 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:41.637 13:49:21 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2246680' 00:30:41.637 killing process with pid 2246680 00:30:41.637 13:49:21 compress_isal -- common/autotest_common.sh@967 -- # kill 2246680 00:30:41.637 13:49:21 compress_isal -- common/autotest_common.sh@972 -- # wait 2246680 00:30:44.919 13:49:23 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:44.919 13:49:23 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:44.919 00:30:44.919 real 0m47.322s 00:30:44.919 user 1m50.587s 00:30:44.919 sys 0m4.255s 00:30:44.919 13:49:23 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:44.919 13:49:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:44.919 ************************************ 00:30:44.919 END TEST compress_isal 00:30:44.919 ************************************ 00:30:44.919 13:49:24 -- common/autotest_common.sh@1142 -- # return 0 00:30:44.919 13:49:24 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:30:44.919 13:49:24 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:30:44.919 13:49:24 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:44.919 13:49:24 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:44.919 13:49:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:44.919 13:49:24 -- common/autotest_common.sh@10 -- # set +x 00:30:44.919 ************************************ 00:30:44.919 START TEST blockdev_crypto_aesni 00:30:44.919 ************************************ 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:30:44.919 * Looking for test storage... 00:30:44.919 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:44.919 13:49:24 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2247980 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:44.919 13:49:24 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2247980 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2247980 ']' 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:44.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:44.919 13:49:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:44.919 [2024-07-15 13:49:24.239987] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:30:44.919 [2024-07-15 13:49:24.240063] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2247980 ] 00:30:45.177 [2024-07-15 13:49:24.369370] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:45.177 [2024-07-15 13:49:24.472087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:45.740 13:49:25 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:45.740 13:49:25 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:30:45.740 13:49:25 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:45.740 13:49:25 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:30:45.740 13:49:25 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:30:45.740 13:49:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:45.740 13:49:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:45.997 [2024-07-15 13:49:25.174310] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:45.997 [2024-07-15 13:49:25.182341] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:45.997 [2024-07-15 13:49:25.190358] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:45.997 [2024-07-15 13:49:25.264484] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:30:48.539 true 00:30:48.539 true 00:30:48.539 true 00:30:48.539 true 00:30:48.539 Malloc0 00:30:48.539 Malloc1 00:30:48.539 Malloc2 00:30:48.539 Malloc3 00:30:48.539 [2024-07-15 13:49:27.661557] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:48.539 crypto_ram 00:30:48.539 [2024-07-15 13:49:27.669575] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:48.539 crypto_ram2 00:30:48.539 [2024-07-15 13:49:27.677598] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:48.539 crypto_ram3 00:30:48.539 [2024-07-15 13:49:27.685620] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:48.539 crypto_ram4 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.539 13:49:27 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.539 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.539 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "23c2f3d9-268b-5beb-9573-e50f7f7e58ce"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "23c2f3d9-268b-5beb-9573-e50f7f7e58ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ebeb8367-26f1-539f-b2dc-7d9f695b2e61"' ' ],' ' "product_name": "crypto",' ' 
"block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ebeb8367-26f1-539f-b2dc-7d9f695b2e61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "153f85f6-67d3-5090-9a6d-385c904b368c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "153f85f6-67d3-5090-9a6d-385c904b368c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' 
"nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:48.540 13:49:27 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2247980 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2247980 ']' 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2247980 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2247980 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2247980' 00:30:48.540 killing process with pid 2247980 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2247980 00:30:48.540 13:49:27 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2247980 
00:30:49.105 13:49:28 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:49.105 13:49:28 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:49.105 13:49:28 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:49.105 13:49:28 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:49.105 13:49:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:49.363 ************************************ 00:30:49.363 START TEST bdev_hello_world 00:30:49.363 ************************************ 00:30:49.363 13:49:28 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:49.363 [2024-07-15 13:49:28.583475] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:30:49.363 [2024-07-15 13:49:28.583535] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248523 ] 00:30:49.363 [2024-07-15 13:49:28.711062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.621 [2024-07-15 13:49:28.809288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.621 [2024-07-15 13:49:28.830594] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:49.621 [2024-07-15 13:49:28.838620] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:49.621 [2024-07-15 13:49:28.846647] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:49.621 [2024-07-15 13:49:28.957569] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:52.215 [2024-07-15 13:49:31.177303] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:52.215 [2024-07-15 13:49:31.177369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:52.215 [2024-07-15 13:49:31.177389] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.215 [2024-07-15 13:49:31.185322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:52.215 [2024-07-15 13:49:31.185342] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:52.215 [2024-07-15 13:49:31.185354] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.215 [2024-07-15 13:49:31.193357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:30:52.215 [2024-07-15 13:49:31.193376] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:52.215 [2024-07-15 13:49:31.193387] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.215 [2024-07-15 13:49:31.201363] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:52.215 [2024-07-15 13:49:31.201380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:52.215 [2024-07-15 13:49:31.201391] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.215 [2024-07-15 13:49:31.274086] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:52.215 [2024-07-15 13:49:31.274130] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:52.215 [2024-07-15 13:49:31.274150] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:30:52.215 [2024-07-15 13:49:31.275416] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:52.215 [2024-07-15 13:49:31.275486] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:52.216 [2024-07-15 13:49:31.275502] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:52.216 [2024-07-15 13:49:31.275546] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:30:52.216 00:30:52.216 [2024-07-15 13:49:31.275565] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:52.473 00:30:52.473 real 0m3.125s 00:30:52.473 user 0m2.724s 00:30:52.473 sys 0m0.363s 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:52.473 ************************************ 00:30:52.473 END TEST bdev_hello_world 00:30:52.473 ************************************ 00:30:52.473 13:49:31 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:30:52.473 13:49:31 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:52.473 13:49:31 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:52.473 13:49:31 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:52.473 13:49:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:52.473 ************************************ 00:30:52.473 START TEST bdev_bounds 00:30:52.473 ************************************ 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2248927 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2248927' 00:30:52.473 Process bdevio pid: 2248927 00:30:52.473 13:49:31 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2248927 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2248927 ']' 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:52.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:52.473 13:49:31 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:52.473 [2024-07-15 13:49:31.811235] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:30:52.473 [2024-07-15 13:49:31.811305] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2248927 ] 00:30:52.731 [2024-07-15 13:49:31.942266] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:52.731 [2024-07-15 13:49:32.042021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:52.731 [2024-07-15 13:49:32.042106] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:52.731 [2024-07-15 13:49:32.042110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:52.731 [2024-07-15 13:49:32.063499] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:52.731 [2024-07-15 13:49:32.071522] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:52.731 [2024-07-15 13:49:32.079542] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:52.988 [2024-07-15 13:49:32.184910] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:55.524 [2024-07-15 13:49:34.405024] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:55.524 [2024-07-15 13:49:34.405113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:55.524 [2024-07-15 13:49:34.405130] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.524 [2024-07-15 13:49:34.413044] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:55.524 [2024-07-15 13:49:34.413065] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:55.524 [2024-07-15 
13:49:34.413077] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.525 [2024-07-15 13:49:34.421068] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:55.525 [2024-07-15 13:49:34.421090] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:55.525 [2024-07-15 13:49:34.421102] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.525 [2024-07-15 13:49:34.429091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:55.525 [2024-07-15 13:49:34.429110] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:55.525 [2024-07-15 13:49:34.429122] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:55.525 I/O targets: 00:30:55.525 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:55.525 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:30:55.525 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:55.525 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:30:55.525 00:30:55.525 00:30:55.525 CUnit - A unit testing framework for C - Version 2.1-3 00:30:55.525 http://cunit.sourceforge.net/ 00:30:55.525 00:30:55.525 00:30:55.525 Suite: bdevio tests on: crypto_ram4 00:30:55.525 Test: blockdev write read block ...passed 00:30:55.525 Test: blockdev write zeroes read block ...passed 00:30:55.525 Test: blockdev write zeroes read no split ...passed 00:30:55.525 Test: blockdev 
write zeroes read split ...passed 00:30:55.525 Test: blockdev write zeroes read split partial ...passed 00:30:55.525 Test: blockdev reset ...passed 00:30:55.525 Test: blockdev write read 8 blocks ...passed 00:30:55.525 Test: blockdev write read size > 128k ...passed 00:30:55.525 Test: blockdev write read invalid size ...passed 00:30:55.525 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:55.525 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:55.525 Test: blockdev write read max offset ...passed 00:30:55.525 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:55.525 Test: blockdev writev readv 8 blocks ...passed 00:30:55.525 Test: blockdev writev readv 30 x 1block ...passed 00:30:55.525 Test: blockdev writev readv block ...passed 00:30:55.525 Test: blockdev writev readv size > 128k ...passed 00:30:55.525 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:55.525 Test: blockdev comparev and writev ...passed 00:30:55.525 Test: blockdev nvme passthru rw ...passed 00:30:55.525 Test: blockdev nvme passthru vendor specific ...passed 00:30:55.525 Test: blockdev nvme admin passthru ...passed 00:30:55.525 Test: blockdev copy ...passed 00:30:55.525 Suite: bdevio tests on: crypto_ram3 00:30:55.525 Test: blockdev write read block ...passed 00:30:55.525 Test: blockdev write zeroes read block ...passed 00:30:55.525 Test: blockdev write zeroes read no split ...passed 00:30:55.525 Test: blockdev write zeroes read split ...passed 00:30:55.525 Test: blockdev write zeroes read split partial ...passed 00:30:55.525 Test: blockdev reset ...passed 00:30:55.525 Test: blockdev write read 8 blocks ...passed 00:30:55.525 Test: blockdev write read size > 128k ...passed 00:30:55.525 Test: blockdev write read invalid size ...passed 00:30:55.525 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:55.525 Test: blockdev write read offset + nbytes > size of blockdev 
...passed
00:30:55.525 Test: blockdev write read max offset ...passed
00:30:55.525 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:30:55.525 Test: blockdev writev readv 8 blocks ...passed
00:30:55.525 Test: blockdev writev readv 30 x 1block ...passed
00:30:55.525 Test: blockdev writev readv block ...passed
00:30:55.525 Test: blockdev writev readv size > 128k ...passed
00:30:55.525 Test: blockdev writev readv size > 128k in two iovs ...passed
00:30:55.525 Test: blockdev comparev and writev ...passed
00:30:55.525 Test: blockdev nvme passthru rw ...passed
00:30:55.525 Test: blockdev nvme passthru vendor specific ...passed
00:30:55.525 Test: blockdev nvme admin passthru ...passed
00:30:55.525 Test: blockdev copy ...passed
00:30:55.525 Suite: bdevio tests on: crypto_ram2
00:30:55.525 Test: blockdev write read block ...passed
00:30:55.525 Test: blockdev write zeroes read block ...passed
00:30:55.525 Test: blockdev write zeroes read no split ...passed
00:30:55.525 Test: blockdev write zeroes read split ...passed
00:30:55.525 Test: blockdev write zeroes read split partial ...passed
00:30:55.525 Test: blockdev reset ...passed
00:30:55.525 Test: blockdev write read 8 blocks ...passed
00:30:55.525 Test: blockdev write read size > 128k ...passed
00:30:55.525 Test: blockdev write read invalid size ...passed
00:30:55.525 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:30:55.525 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:30:55.525 Test: blockdev write read max offset ...passed
00:30:55.525 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:30:55.525 Test: blockdev writev readv 8 blocks ...passed
00:30:55.525 Test: blockdev writev readv 30 x 1block ...passed
00:30:55.525 Test: blockdev writev readv block ...passed
00:30:55.525 Test: blockdev writev readv size > 128k ...passed
00:30:55.525 Test: blockdev writev readv size > 128k in two iovs ...passed
00:30:55.525 Test: blockdev comparev and writev ...passed
00:30:55.525 Test: blockdev nvme passthru rw ...passed
00:30:55.525 Test: blockdev nvme passthru vendor specific ...passed
00:30:55.525 Test: blockdev nvme admin passthru ...passed
00:30:55.525 Test: blockdev copy ...passed
00:30:55.525 Suite: bdevio tests on: crypto_ram
00:30:55.525 Test: blockdev write read block ...passed
00:30:55.525 Test: blockdev write zeroes read block ...passed
00:30:55.525 Test: blockdev write zeroes read no split ...passed
00:30:55.525 Test: blockdev write zeroes read split ...passed
00:30:55.525 Test: blockdev write zeroes read split partial ...passed
00:30:55.525 Test: blockdev reset ...passed
00:30:55.525 Test: blockdev write read 8 blocks ...passed
00:30:55.525 Test: blockdev write read size > 128k ...passed
00:30:55.525 Test: blockdev write read invalid size ...passed
00:30:55.525 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:30:55.525 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:30:55.525 Test: blockdev write read max offset ...passed
00:30:55.525 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:30:55.525 Test: blockdev writev readv 8 blocks ...passed
00:30:55.525 Test: blockdev writev readv 30 x 1block ...passed
00:30:55.525 Test: blockdev writev readv block ...passed
00:30:55.525 Test: blockdev writev readv size > 128k ...passed
00:30:55.525 Test: blockdev writev readv size > 128k in two iovs ...passed
00:30:55.525 Test: blockdev comparev and writev ...passed
00:30:55.525 Test: blockdev nvme passthru rw ...passed
00:30:55.525 Test: blockdev nvme passthru vendor specific ...passed
00:30:55.525 Test: blockdev nvme admin passthru ...passed
00:30:55.525 Test: blockdev copy ...passed
00:30:55.525
00:30:55.525 Run Summary: Type Total Ran Passed Failed Inactive
00:30:55.525 suites 4 4 n/a 0 0
00:30:55.525 tests 92 92 92 0 0
00:30:55.525 asserts 520 520 520 0 n/a
00:30:55.525
00:30:55.525 Elapsed time = 0.537 seconds
00:30:55.525 0
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2248927
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2248927 ']'
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2248927
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:30:55.525 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2248927
00:30:55.783 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:30:55.783 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:30:55.783 13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2248927'
killing process with pid 2248927
13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2248927
13:49:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2248927
00:30:56.040 13:49:35 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT
00:30:56.040
00:30:56.040 real 0m3.598s
00:30:56.040 user 0m10.028s
00:30:56.040 sys 0m0.574s
00:30:56.040 13:49:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable
00:30:56.040 13:49:35 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:30:56.040 ************************************
00:30:56.040 END TEST bdev_bounds
00:30:56.040 ************************************
00:30:56.040 13:49:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:30:56.040 13:49:35
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:56.040 13:49:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:56.040 13:49:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:56.040 13:49:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:56.040 ************************************ 00:30:56.040 START TEST bdev_nbd 00:30:56.041 ************************************ 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2249451 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2249451 /var/tmp/spdk-nbd.sock 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2249451 ']' 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:30:56.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:56.041 13:49:35 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:56.298 [2024-07-15 13:49:35.469855] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:30:56.298 [2024-07-15 13:49:35.469899] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:56.298 [2024-07-15 13:49:35.581281] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:56.298 [2024-07-15 13:49:35.685421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:56.298 [2024-07-15 13:49:35.706697] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:56.298 [2024-07-15 13:49:35.714719] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:56.298 [2024-07-15 13:49:35.722743] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:56.555 [2024-07-15 13:49:35.829269] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:59.081 [2024-07-15 13:49:38.049995] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:59.081 [2024-07-15 13:49:38.050054] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:59.081 [2024-07-15 13:49:38.050070] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.081 [2024-07-15 13:49:38.058017] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:30:59.081 [2024-07-15 13:49:38.058036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:59.081 [2024-07-15 13:49:38.058049] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.081 [2024-07-15 13:49:38.066038] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:59.081 [2024-07-15 13:49:38.066059] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:59.081 [2024-07-15 13:49:38.066071] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.081 [2024-07-15 13:49:38.074058] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:59.081 [2024-07-15 13:49:38.074075] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:59.081 [2024-07-15 13:49:38.074087] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.081 13:49:38 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.081 1+0 records in 00:30:59.081 1+0 records out 00:30:59.081 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284102 s, 14.4 MB/s 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.081 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:59.337 13:49:38 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.337 1+0 records in 00:30:59.337 1+0 records out 00:30:59.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339505 s, 12.1 MB/s 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.337 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.593 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.593 13:49:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.593 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.594 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.594 13:49:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.850 1+0 records in 00:30:59.850 1+0 records out 00:30:59.850 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003273 s, 12.5 MB/s 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.850 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:00.106 13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct
00:31:00.106 1+0 records in
00:31:00.106 1+0 records out
00:31:00.106 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352231 s, 11.6 MB/s
13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
13:49:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:31:00.363 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd0",
00:31:00.363 "bdev_name": "crypto_ram"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd1",
00:31:00.363 "bdev_name": "crypto_ram2"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd2",
00:31:00.363 "bdev_name": "crypto_ram3"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd3",
00:31:00.363 "bdev_name": "crypto_ram4"
00:31:00.363 }
00:31:00.363 ]'
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd0",
00:31:00.363 "bdev_name": "crypto_ram"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd1",
00:31:00.363 "bdev_name": "crypto_ram2"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd2",
00:31:00.363 "bdev_name": "crypto_ram3"
00:31:00.363 },
00:31:00.363 {
00:31:00.363 "nbd_device": "/dev/nbd3",
00:31:00.363 "bdev_name": "crypto_ram4"
00:31:00.363 }
00:31:00.363 ]'
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3'
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3')
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:00.620 13:49:39
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:00.620 13:49:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:00.877 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:01.134 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.392 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:31:01.650 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:01.650 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:01.650 13:49:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:01.650 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:01.916 /dev/nbd0 00:31:01.916 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:01.917 
13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:01.917 1+0 records in 00:31:01.917 1+0 records out 00:31:01.917 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284384 s, 14.4 MB/s 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:01.917 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:31:02.174 /dev/nbd1 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.174 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:02.432 1+0 records in 00:31:02.432 1+0 records out 00:31:02.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312316 s, 13.1 MB/s 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.432 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:31:02.432 /dev/nbd10 00:31:02.689 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:31:02.689 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:31:02.689 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:02.690 1+0 records in 00:31:02.690 1+0 records out 00:31:02.690 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030596 s, 13.4 MB/s 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.690 13:49:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:31:02.947 /dev/nbd11 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:31:02.947 1+0 records in 00:31:02.947 1+0 records out 00:31:02.947 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309935 s, 13.2 MB/s 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:02.947 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd0", 00:31:03.204 "bdev_name": "crypto_ram" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd1", 00:31:03.204 "bdev_name": "crypto_ram2" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd10", 00:31:03.204 "bdev_name": "crypto_ram3" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd11", 00:31:03.204 "bdev_name": "crypto_ram4" 00:31:03.204 } 00:31:03.204 ]' 00:31:03.204 13:49:42 
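The `waitfornbd` calls traced above all follow the same shape: poll `/proc/partitions` for the device name as a whole word, up to 20 times. A minimal runnable sketch of that pattern (a temp file stands in for `/proc/partitions` so it runs without NBD devices or root):

```shell
#!/usr/bin/env bash
# Hedged sketch of the waitfornbd helper traced in the log above.
# A temp file stands in for /proc/partitions so the sketch runs anywhere.
partitions=$(mktemp)
printf '%s\n' "259 0 1048576 nbd0" > "$partitions"

waitfornbd() {
    local nbd_name=$1 i
    for (( i = 1; i <= 20; i++ )); do
        # -w matches the whole word, so waiting for nbd1 does not match nbd10
        grep -q -w "$nbd_name" "$partitions" && return 0
        sleep 0.1
    done
    return 1
}

waitfornbd nbd0 && found=yes || found=no
echo "nbd0 found: $found"
rm -f "$partitions"
```

The `-w` flag matters here: the log waits on `nbd1` while `nbd10` and `nbd11` also exist, and a plain substring match would report a false positive.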
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd0", 00:31:03.204 "bdev_name": "crypto_ram" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd1", 00:31:03.204 "bdev_name": "crypto_ram2" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd10", 00:31:03.204 "bdev_name": "crypto_ram3" 00:31:03.204 }, 00:31:03.204 { 00:31:03.204 "nbd_device": "/dev/nbd11", 00:31:03.204 "bdev_name": "crypto_ram4" 00:31:03.204 } 00:31:03.204 ]' 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:03.204 /dev/nbd1 00:31:03.204 /dev/nbd10 00:31:03.204 /dev/nbd11' 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:03.204 /dev/nbd1 00:31:03.204 /dev/nbd10 00:31:03.204 /dev/nbd11' 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:03.204 13:49:42 
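The count check above extracts `.nbd_device` values from the `nbd_get_disks` JSON with `jq`, then counts them with `grep -c /dev/nbd`. A sketch of just the counting step, with the jq output copied from the log so it runs without an SPDK socket:

```shell
#!/usr/bin/env bash
# Sketch of the nbd_get_count device-count check. The name list below
# copies the jq output visible in the log; the RPC/jq steps are elided.
nbd_disks_name='/dev/nbd0
/dev/nbd1
/dev/nbd10
/dev/nbd11'
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd)
echo "count=$count"
if [ "$count" -ne 4 ]; then
    echo "unexpected nbd count" >&2
fi
```

This mirrors the `'[' 4 -ne 4 ']'` guard in the trace, which fails the test if fewer devices came up than were started.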
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:03.204 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:03.205 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:03.205 256+0 records in 00:31:03.205 256+0 records out 00:31:03.205 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114529 s, 91.6 MB/s 00:31:03.205 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:03.205 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:03.205 256+0 records in 00:31:03.205 256+0 records out 00:31:03.205 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0613222 s, 17.1 MB/s 00:31:03.205 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:03.205 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:03.461 256+0 records in 00:31:03.461 256+0 records out 00:31:03.461 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0657933 s, 15.9 MB/s 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:31:03.461 256+0 records in 00:31:03.461 256+0 records out 00:31:03.461 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0602886 s, 17.4 MB/s 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:31:03.461 256+0 records in 00:31:03.461 256+0 records out 00:31:03.461 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.035699 s, 29.4 MB/s 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:03.461 13:49:42 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:31:03.461 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:03.462 13:49:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- 
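The `nbd_dd_data_verify` write/verify round trip above fills `nbdrandtest` with 1 MiB of random data, `dd`s it onto every NBD device, then `cmp`s each device back against the source. A runnable sketch with plain files standing in for the `/dev/nbd*` devices (no NBD setup or root needed):

```shell
#!/usr/bin/env bash
# Hedged sketch of the write-then-verify pattern from the log above.
tmp_file=$(mktemp)
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null

targets=()
for n in 0 1 2 3; do
    t=$(mktemp)
    # "write" phase: copy the random data onto each stand-in device
    dd if="$tmp_file" of="$t" bs=4096 count=256 2>/dev/null
    targets+=("$t")
done

verified=0
for t in "${targets[@]}"; do
    # "verify" phase: cmp is silent and exits 0 when the first 1M bytes match
    cmp -b -n 1M "$tmp_file" "$t" && verified=$((verified + 1))
done
echo "verified $verified of ${#targets[@]} targets"
rm -f "$tmp_file" "${targets[@]}"
```

In the real test the `oflag=direct`/`iflag=direct` flags bypass the page cache so the data actually traverses the NBD path into the crypto bdev; that is omitted here since it is meaningless for regular files.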
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:03.718 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:03.982 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:04.239 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:04.240 13:49:43 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:04.496 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:04.753 13:49:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:04.753 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:05.011 13:49:44 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:05.011 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:05.285 malloc_lvol_verify 00:31:05.285 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:05.544 f977d213-17db-4c0e-87e7-c7505bd618ac 00:31:05.544 13:49:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:05.801 0d0fb9a2-3f4f-4bed-8019-869e1ac1ee08 00:31:05.801 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:06.058 /dev/nbd0 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:06.058 mke2fs 1.46.5 (30-Dec-2021) 00:31:06.058 Discarding device blocks: 0/4096 done 00:31:06.058 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:06.058 00:31:06.058 Allocating group tables: 0/1 done 00:31:06.058 Writing inode tables: 0/1 done 00:31:06.058 Creating journal (1024 blocks): done 00:31:06.058 Writing superblocks and filesystem accounting information: 0/1 done 00:31:06.058 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:06.058 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2249451 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2249451 
']' 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2249451 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2249451 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2249451' 00:31:06.316 killing process with pid 2249451 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2249451 00:31:06.316 13:49:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2249451 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:06.934 00:31:06.934 real 0m10.627s 00:31:06.934 user 0m13.952s 00:31:06.934 sys 0m4.211s 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:06.934 ************************************ 00:31:06.934 END TEST bdev_nbd 00:31:06.934 ************************************ 00:31:06.934 13:49:46 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:06.934 13:49:46 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:06.934 13:49:46 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:31:06.934 13:49:46 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:31:06.934 13:49:46 
blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:06.934 13:49:46 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:06.934 13:49:46 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:06.934 13:49:46 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:06.934 ************************************ 00:31:06.934 START TEST bdev_fio 00:31:06.934 ************************************ 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:06.934 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:06.934 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 
00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:06.935 ************************************ 00:31:06.935 
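The loop above appends one `[job_<bdev>]` / `filename=<bdev>` stanza to `bdev.fio` per bdev. A sketch of that generation step; the global section mirrors the fio parameters visible in the log, and a temp file is used in place of `test/bdev/bdev.fio`:

```shell
#!/usr/bin/env bash
# Sketch of the per-bdev fio job-file generation from the trace above.
config=$(mktemp)
cat > "$config" <<'EOF'
[global]
ioengine=spdk_bdev
iodepth=8
bs=4k
runtime=10
EOF

for b in crypto_ram crypto_ram2 crypto_ram3 crypto_ram4; do
    {
        echo "[job_$b]"
        echo "filename=$b"
    } >> "$config"
done

jobs_written=$(grep -c '^\[job_' "$config")
echo "jobs_written=$jobs_written"
rm -f "$config"
```

With `ioengine=spdk_bdev`, fio's `filename` is interpreted as an SPDK bdev name rather than a filesystem path, which is why the bare names `crypto_ram*` work.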
START TEST bdev_fio_rw_verify 00:31:06.935 ************************************ 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:06.935 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:07.203 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:07.203 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:07.203 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:07.203 13:49:46 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:07.203 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:07.203 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:07.203 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:07.203 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:07.203 fio-3.35 00:31:07.203 Starting 4 threads 00:31:22.067 00:31:22.067 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2251489: Mon Jul 15 13:49:59 2024 00:31:22.067 read: IOPS=18.1k, BW=70.5MiB/s (73.9MB/s)(705MiB/10001msec) 00:31:22.067 slat (usec): min=16, max=1525, avg=72.15, stdev=37.48 00:31:22.067 clat (usec): min=13, max=2586, avg=393.37, stdev=238.33 00:31:22.067 lat (usec): min=70, max=2656, avg=465.52, stdev=258.67 00:31:22.067 clat percentiles (usec): 00:31:22.067 | 50.000th=[ 334], 99.000th=[ 1090], 99.900th=[ 1237], 99.990th=[ 1385], 00:31:22.067 | 99.999th=[ 2474] 00:31:22.067 write: IOPS=19.9k, BW=77.9MiB/s (81.7MB/s)(760MiB/9755msec); 0 zone resets 00:31:22.067 slat (usec): min=25, max=591, avg=88.20, stdev=36.07 00:31:22.067 clat (usec): min=49, max=2601, avg=484.06, stdev=283.24 00:31:22.067 lat (usec): min=91, max=2683, avg=572.25, stdev=301.67 00:31:22.067 clat percentiles (usec): 00:31:22.067 | 50.000th=[ 429], 99.000th=[ 1319], 99.900th=[ 1516], 99.990th=[ 1647], 00:31:22.067 | 99.999th=[ 2212] 00:31:22.068 bw ( KiB/s): min=63432, max=101720, per=97.47%, avg=77771.79, stdev=2471.64, samples=76 00:31:22.068 iops : min=15858, max=25430, avg=19442.95, stdev=617.91, samples=76 00:31:22.068 lat (usec) : 20=0.01%, 50=0.01%, 100=3.77%, 250=23.68%, 
500=38.57% 00:31:22.068 lat (usec) : 750=19.98%, 1000=10.09% 00:31:22.068 lat (msec) : 2=3.90%, 4=0.01% 00:31:22.068 cpu : usr=99.52%, sys=0.00%, ctx=77, majf=0, minf=277 00:31:22.068 IO depths : 1=9.9%, 2=25.6%, 4=51.3%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:22.068 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:22.068 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:22.068 issued rwts: total=180520,194581,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:22.068 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:22.068 00:31:22.068 Run status group 0 (all jobs): 00:31:22.068 READ: bw=70.5MiB/s (73.9MB/s), 70.5MiB/s-70.5MiB/s (73.9MB/s-73.9MB/s), io=705MiB (739MB), run=10001-10001msec 00:31:22.068 WRITE: bw=77.9MiB/s (81.7MB/s), 77.9MiB/s-77.9MiB/s (81.7MB/s-81.7MB/s), io=760MiB (797MB), run=9755-9755msec 00:31:22.068 00:31:22.068 real 0m13.563s 00:31:22.068 user 0m45.688s 00:31:22.068 sys 0m0.505s 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:22.068 ************************************ 00:31:22.068 END TEST bdev_fio_rw_verify 00:31:22.068 ************************************ 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "23c2f3d9-268b-5beb-9573-e50f7f7e58ce"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "23c2f3d9-268b-5beb-9573-e50f7f7e58ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ebeb8367-26f1-539f-b2dc-7d9f695b2e61"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ebeb8367-26f1-539f-b2dc-7d9f695b2e61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "153f85f6-67d3-5090-9a6d-385c904b368c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "153f85f6-67d3-5090-9a6d-385c904b368c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' 
"nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:22.068 crypto_ram2 00:31:22.068 crypto_ram3 00:31:22.068 crypto_ram4 ]] 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "23c2f3d9-268b-5beb-9573-e50f7f7e58ce"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "23c2f3d9-268b-5beb-9573-e50f7f7e58ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "804e7c9d-90eb-5ed2-b0ec-fec50ae85dac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ebeb8367-26f1-539f-b2dc-7d9f695b2e61"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": 
"ebeb8367-26f1-539f-b2dc-7d9f695b2e61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "153f85f6-67d3-5090-9a6d-385c904b368c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "153f85f6-67d3-5090-9a6d-385c904b368c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:22.068 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:31:22.069 13:49:59 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:22.069 ************************************ 00:31:22.069 START TEST bdev_fio_trim 00:31:22.069 ************************************ 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:22.069 13:49:59 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 
00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:22.069 13:50:00 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:22.069 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.069 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.069 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.069 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:22.069 fio-3.35 00:31:22.069 Starting 4 threads 00:31:34.272 00:31:34.272 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2253345: Mon Jul 15 13:50:13 2024 00:31:34.272 write: IOPS=31.8k, BW=124MiB/s (130MB/s)(1240MiB/10001msec); 0 zone resets 00:31:34.272 slat (usec): min=17, max=1454, avg=72.60, stdev=26.54 00:31:34.272 clat (usec): min=42, max=1924, avg=318.40, stdev=148.17 00:31:34.272 lat (usec): min=80, max=1994, avg=391.01, stdev=158.50 00:31:34.272 clat percentiles (usec): 00:31:34.272 | 50.000th=[ 289], 99.000th=[ 676], 99.900th=[ 758], 99.990th=[ 1057], 00:31:34.272 | 
99.999th=[ 1647] 00:31:34.272 bw ( KiB/s): min=105856, max=208554, per=100.00%, avg=127290.32, stdev=6976.39, samples=76 00:31:34.272 iops : min=26462, max=52139, avg=31822.37, stdev=1744.02, samples=76 00:31:34.272 trim: IOPS=31.8k, BW=124MiB/s (130MB/s)(1240MiB/10001msec); 0 zone resets 00:31:34.272 slat (usec): min=4, max=443, avg=20.09, stdev=11.20 00:31:34.272 clat (usec): min=11, max=1744, avg=300.99, stdev=168.31 00:31:34.272 lat (usec): min=49, max=1762, avg=321.09, stdev=175.48 00:31:34.272 clat percentiles (usec): 00:31:34.272 | 50.000th=[ 265], 99.000th=[ 766], 99.900th=[ 832], 99.990th=[ 889], 00:31:34.272 | 99.999th=[ 1090] 00:31:34.272 bw ( KiB/s): min=105840, max=208586, per=100.00%, avg=127292.42, stdev=6976.61, samples=76 00:31:34.272 iops : min=26460, max=52145, avg=31822.89, stdev=1744.05, samples=76 00:31:34.272 lat (usec) : 20=0.01%, 50=0.18%, 100=4.35%, 250=38.65%, 500=42.65% 00:31:34.272 lat (usec) : 750=13.46%, 1000=0.71% 00:31:34.272 lat (msec) : 2=0.01% 00:31:34.272 cpu : usr=99.62%, sys=0.00%, ctx=60, majf=0, minf=118 00:31:34.272 IO depths : 1=5.2%, 2=27.1%, 4=54.2%, 8=13.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:34.272 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:34.272 complete : 0=0.0%, 4=88.1%, 8=11.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:34.272 issued rwts: total=0,317553,317554,0 short=0,0,0,0 dropped=0,0,0,0 00:31:34.272 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:34.272 00:31:34.272 Run status group 0 (all jobs): 00:31:34.272 WRITE: bw=124MiB/s (130MB/s), 124MiB/s-124MiB/s (130MB/s-130MB/s), io=1240MiB (1301MB), run=10001-10001msec 00:31:34.272 TRIM: bw=124MiB/s (130MB/s), 124MiB/s-124MiB/s (130MB/s-130MB/s), io=1240MiB (1301MB), run=10001-10001msec 00:31:34.272 00:31:34.272 real 0m13.529s 00:31:34.272 user 0m45.786s 00:31:34.272 sys 0m0.495s 00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:34.272 
13:50:13 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:31:34.272 ************************************
00:31:34.272 END TEST bdev_fio_trim
00:31:34.272 ************************************
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:31:34.272 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:31:34.272
00:31:34.272 real 0m27.437s
00:31:34.272 user 1m31.643s
00:31:34.272 sys 0m1.197s
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:31:34.272 ************************************
00:31:34.272 END TEST bdev_fio
00:31:34.272 ************************************
00:31:34.272 13:50:13 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:34.272 13:50:13 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:31:34.272 13:50:13 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:31:34.272 13:50:13 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:31:34.272 13:50:13 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:34.272 13:50:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:34.272 ************************************
00:31:34.272 START TEST bdev_verify
00:31:34.272 ************************************
00:31:34.272 13:50:13 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:31:34.530 [2024-07-15 13:50:13.725143] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:31:34.530 [2024-07-15 13:50:13.725210] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2254761 ]
00:31:34.530 [2024-07-15 13:50:13.855063] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:31:34.787 [2024-07-15 13:50:13.964510] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:31:34.787 [2024-07-15 13:50:13.964515] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:34.787 [2024-07-15 13:50:13.985884] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:31:34.787 [2024-07-15 13:50:13.993912] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:31:34.787 [2024-07-15 13:50:14.001948] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:31:34.787 [2024-07-15 13:50:14.105646] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:31:37.334 [2024-07-15 13:50:16.324592] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:31:37.334 [2024-07-15 13:50:16.324677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:37.334 [2024-07-15 13:50:16.324693] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:37.334 [2024-07-15 13:50:16.332608] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:31:37.334 [2024-07-15 13:50:16.332628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:37.334 [2024-07-15 13:50:16.332641] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:37.334 [2024-07-15 13:50:16.340633] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:31:37.334 [2024-07-15 13:50:16.340651] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:37.334 [2024-07-15 13:50:16.340663] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:37.334 [2024-07-15 13:50:16.348657] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:31:37.334 [2024-07-15 13:50:16.348674] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:37.334 [2024-07-15 13:50:16.348686] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:37.334 Running I/O for 5 seconds...
00:31:42.593
00:31:42.593                                                Latency(us)
00:31:42.593 Device Information : runtime(s)     IOPS    MiB/s   Fail/s    TO/s    Average       min        max
00:31:42.593 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x0 length 0x1000
00:31:42.593   crypto_ram  :       5.07   499.84     1.95     0.00    0.00  255069.96   4359.57  167772.16
00:31:42.593 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x1000 length 0x1000
00:31:42.593   crypto_ram  :       5.07   504.91     1.97     0.00    0.00  252960.85   4701.50  166860.35
00:31:42.593 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x0 length 0x1000
00:31:42.593   crypto_ram2 :       5.07   502.70     1.96     0.00    0.00  253149.37   5841.25  155918.69
00:31:42.593 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x1000 length 0x1000
00:31:42.593   crypto_ram2 :       5.07   504.81     1.97     0.00    0.00  252188.18   4872.46  155006.89
00:31:42.593 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x0 length 0x1000
00:31:42.593   crypto_ram3 :       5.06  3872.20    15.13     0.00    0.00   32766.27   5043.42   25644.52
00:31:42.593 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x1000 length 0x1000
00:31:42.593   crypto_ram3 :       5.05  3901.48    15.24     0.00    0.00   32516.87   7750.34   25530.55
00:31:42.593 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x0 length 0x1000
00:31:42.593   crypto_ram4 :       5.06  3872.73    15.13     0.00    0.00   32677.76   5071.92   25758.50
00:31:42.593 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:31:42.593   Verification LBA range: start 0x1000 length 0x1000
00:31:42.593   crypto_ram4 :       5.06  3908.53    15.27     0.00    0.00   32373.30   1381.95   25530.55
00:31:42.593 ===================================================================================================================
00:31:42.593 Total       :             17567.20    68.62     0.00    0.00   57941.15   1381.95  167772.16
00:31:42.593
00:31:42.593 real 0m8.266s
00:31:42.593 user 0m15.670s
00:31:42.593 sys 0m0.371s
00:31:42.593 13:50:21 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:42.593 13:50:21 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:31:42.593 ************************************
00:31:42.593 END TEST bdev_verify
00:31:42.593 ************************************
00:31:42.593 13:50:21 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:42.593 13:50:21 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:31:42.593 13:50:21 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:31:42.593 13:50:21 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:42.593 13:50:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:42.593 ************************************
00:31:42.593 START TEST bdev_verify_big_io
00:31:42.593 ************************************
00:31:42.593 13:50:21 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:31:42.850 [2024-07-15 13:50:22.056609] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:31:42.850 [2024-07-15 13:50:22.056668] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2255823 ]
00:31:42.850 [2024-07-15 13:50:22.183383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:31:43.107 [2024-07-15 13:50:22.281332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:31:43.107 [2024-07-15 13:50:22.281338] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:43.107 [2024-07-15 13:50:22.302682] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:31:43.107 [2024-07-15 13:50:22.310711] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:31:43.107 [2024-07-15 13:50:22.318740] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:31:43.107 [2024-07-15 13:50:22.417935] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:31:45.626 [2024-07-15 13:50:24.641389] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:31:45.626 [2024-07-15 13:50:24.641472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:45.626 [2024-07-15 13:50:24.641487] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:45.626 [2024-07-15 13:50:24.649422] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:31:45.626 [2024-07-15 13:50:24.649442] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:45.626 [2024-07-15 13:50:24.649454] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:45.626 [2024-07-15 13:50:24.657430] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:31:45.626 [2024-07-15 13:50:24.657448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:45.627 [2024-07-15 13:50:24.657459] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:45.627 [2024-07-15 13:50:24.665454] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:31:45.627 [2024-07-15 13:50:24.665471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:45.627 [2024-07-15 13:50:24.665483] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:45.627 Running I/O for 5 seconds...
00:31:48.201 [2024-07-15 13:50:27.489030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.490697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.492323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.493574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.496306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.497355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.201 [2024-07-15 13:50:27.497743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.461 [2024-07-15 13:50:27.718987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.720379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.720425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.721355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.721410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.722086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.722132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.461 [2024-07-15 13:50:27.724805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.724864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.726275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.726323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.727181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.727235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.728675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.728720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.731091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.731147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.731845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.731898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.732854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.732910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.734328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.734379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.737388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.737452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.737840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.737882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.739294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.739350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.740581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.740630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.743270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.743328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.743710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.743754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.745247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.745308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.746790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.746838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.748773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.748830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.749223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.749266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.750074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.750136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.750533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.750582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.752426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.752483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.752868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.752910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.752936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.753318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.753815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.753882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.754278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.754328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.754347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.754733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.755867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.756282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.756334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.756719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.757116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.757275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.757673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.757732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.758134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.758523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.759776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.759833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.759885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.759946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.760343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.760499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.760545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.760585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.760628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.760999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.762859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.763243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.764359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.764958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.765003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.765048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.765366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.766524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.766576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.766618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.766659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.767121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.462 [2024-07-15 13:50:27.767271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.767317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.767371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.767414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.767857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.769005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.462 [2024-07-15 13:50:27.769071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.769690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.463 [2024-07-15 13:50:27.769732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.770048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.771854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.772349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.773766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.463 [2024-07-15 13:50:27.773823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.773881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.773943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.774382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.774537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.774594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.774636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.774678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.775082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.463 [2024-07-15 13:50:27.776460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.776747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.777153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.778340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.778403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.778444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.778499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.778888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.779056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.779103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.463 [2024-07-15 13:50:27.779145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.779187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.779618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.780677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.780729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.780771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.780813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.463 [2024-07-15 13:50:27.781906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.463 [2024-07-15 13:50:27.782778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated through 13:50:28.169904 ...]
00:31:48.984 [2024-07-15 13:50:28.169904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.984 [2024-07-15 13:50:28.170198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:48.984 [2024-07-15 13:50:28.170705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:48.984 [2024-07-15 13:50:28.170779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:48.984 [2024-07-15 13:50:28.171184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:48.984 [2024-07-15 13:50:28.171247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:48.984 [2024-07-15 13:50:28.172587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.172640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.172682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.172724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.173032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.173188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.173234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.173276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.173325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.984 [2024-07-15 13:50:28.174856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.174914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.174997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.175586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.176754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.176805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.176862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.176904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.984 [2024-07-15 13:50:28.177326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.177478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.177523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.177565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.177607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.178722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.178775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.178817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.178859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.179127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.179282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.179335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.179376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.984 [2024-07-15 13:50:28.179431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.180885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.180948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.180992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.181034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.181296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.984 [2024-07-15 13:50:28.181449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.181498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.181546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.181588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.182829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.182898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.182957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.183002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.183263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.183420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.183466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.183507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.183547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.185748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.185789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.186963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.187764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.188948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.190250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.190297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.191404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.191722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.191878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.192280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.192329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.192718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.193982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.195773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.195834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.197444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.197843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.198010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.198683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.198729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.200413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.201782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.202727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.202772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.204064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.204330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.204486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.205029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.205076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.205459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.206653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.207063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.207115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.208645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.208917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.209078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.209471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.209515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.209911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.211076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.212358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.212408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.213885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.214321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.985 [2024-07-15 13:50:28.214478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.215699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.215745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.216929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.218139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.219574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.219621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.220434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.220699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.985 [2024-07-15 13:50:28.220852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.221886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.221939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.223610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.986 [2024-07-15 13:50:28.225138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.226634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.226683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.228312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.228704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.228859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.230609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.230670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.232273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.233607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.234439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.234487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.235389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.986 [2024-07-15 13:50:28.235655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.235818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.236697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.236744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.238184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.239261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.239663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.239710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.241171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.241579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.241737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.243520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.243578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.986 [2024-07-15 13:50:28.245200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.246452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.247388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.247437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.248235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.248504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.248658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.249084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.249130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.250437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.254217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.255940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.986 [2024-07-15 13:50:28.255987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.986 [2024-07-15 13:50:28.256659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:49.247 [2024-07-15 13:50:28.554109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.555356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.555404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.555712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.557011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.557067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.558310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.558370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.564667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.564731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.566465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.566524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.566908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.247 [2024-07-15 13:50:28.568371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.568426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.569797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.569843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.573460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.573519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.574518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.574566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.574902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.576271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.576326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.577103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.577151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.247 [2024-07-15 13:50:28.578724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.578783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.580274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.580335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.580679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.582300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.582362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.247 [2024-07-15 13:50:28.583432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.583478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.588445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.588505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.589762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.589814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.590199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.591471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.591535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.593261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.593307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.599756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.599822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.600223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.600274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.600545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.601146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.601202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.602408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.602452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.607735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.607792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.608189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.608245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.608611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.609118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.609173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.610531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.610576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.615261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.615325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.615916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.615974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.616248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.616742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.616801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.617208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.617263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.622135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.622202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.622590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.622638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.622909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.623645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.623700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.624773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.624818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.630001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.630058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.630446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.630500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.630953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.631449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.631518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.633141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.633185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.638029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.638087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.638476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.638522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.638793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.639297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.639353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.639746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.639800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.645605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.645668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.646068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.646131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.646402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.646898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.646961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.648334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.648378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.653520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.653585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.653983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.654033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.654501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.655013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.655072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.656442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.656488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.661105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.248 [2024-07-15 13:50:28.661164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.661816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.661863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.662142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.662642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.662697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.663115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.663165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.668395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.669103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.670246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.248 [2024-07-15 13:50:28.670293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.670659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.509 [2024-07-15 13:50:28.672187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.672984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.674135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.674190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.678947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.679013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.679057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.679098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.679367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.680057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.680118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.680165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.680205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.509 [2024-07-15 13:50:28.683057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.683763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.509 [2024-07-15 13:50:28.688754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.688985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.689028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.689077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.692625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.692683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.692732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.692774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.693157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.509 [2024-07-15 13:50:28.693308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.510 [2024-07-15 13:50:28.693354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.510 [2024-07-15 13:50:28.693394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.510 [2024-07-15 13:50:28.693435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:49.510 ... (same *ERROR* line repeated for every entry between 13:50:28.693435 and 13:50:28.959636; duplicate occurrences omitted) ...
00:31:49.772 [2024-07-15 13:50:28.959636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:49.772 [2024-07-15 13:50:28.960994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.962343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.962616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.964126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.965637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.966073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.967752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.972865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.974530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.976017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.977382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.977655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.978570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.772 [2024-07-15 13:50:28.979877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.980720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.981823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.987354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.772 [2024-07-15 13:50:28.988872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.990694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.991781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.992142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.993473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.994155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.995097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:28.996358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.000670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.002078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.002616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.003670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.003986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.005314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.006460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.007716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.008776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.012112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.013196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.014401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.015649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.015970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.017886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.018281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.019853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.020249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.024917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.026599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.027066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.028547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.028947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.030618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.032022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.032570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.034186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.036930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.038565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.040021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.040073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.040421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.041935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.043613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.044475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.044523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.049099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.049162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.050190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.050236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.050599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.052012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.052068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.053768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.053822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.059809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.059870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.061496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.061567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.061841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.062672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.062727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.063725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.063773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.067711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.067769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.068760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.068809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.069085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.070732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.070798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.071194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.071241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.075747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.075803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.077336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.077384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.077740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.078984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.079036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.080252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.080298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.085260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.085319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.086517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.086566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.086845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.087352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.087409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.088937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.088981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.093873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.093935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.094609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.094656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.094932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.095919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.095979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.096812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.096858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.101458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.101522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.103006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.103052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.103504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.105407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.105463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.105848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.105894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.110894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.110960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.112224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.112268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.112551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.113603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.113658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.114464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.114515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.118042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.118100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.118769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.118814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.119100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.120211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.120280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.121904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.121956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.128266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.128323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.129144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.129191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.129515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.130194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.130251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.131537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.131587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.135835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.135892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.137485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.137533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.138002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.138500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.138554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.138951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.139013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.141360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.141419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.142847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.142894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.143350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.145180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.145241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.145627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:49.773 [2024-07-15 13:50:29.145681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:49.773 [2024-07-15 13:50:29.148522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [... previous *ERROR* line repeated continuously (timestamps 13:50:29.148578 through 13:50:29.318320; only the timestamps differ) ...] 
00:31:50.038 [2024-07-15 13:50:29.318370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.320005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.320282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.320435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.321794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.321842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.323199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.324730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.325448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.325497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.326852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.327155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.327307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.328687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.328747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.329814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.331066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.332446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.332493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.332917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.333268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.333420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.334658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.334705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.336061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.337233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.339015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.339070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.340863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.341143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.341295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.342920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.342980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.343370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.344699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.346065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.346114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.347479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.347793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.347954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.349453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.349505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.350971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.352198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.352599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.352644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.354043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.354382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.354529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.355902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.355954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.357323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.358540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.360275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.360329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.362114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.362502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.362658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.363062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.363107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.364790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.366064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.366954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.367003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.368387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.368660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.038 [2024-07-15 13:50:29.368813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.038 [2024-07-15 13:50:29.370402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.370453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.371989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.373331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.374689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.374742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.376125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.376465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.376616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.377494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.377542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.378875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.380038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.380437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.380481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.380872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.381151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.381306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.382917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.382975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.384624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.385861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.387319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.387367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.388787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.389068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.389218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.389610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.389667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.390059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.391286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.392662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.392709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.393560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.393850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.394009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.395379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.395425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.396791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.398176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.398231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.398277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.399761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.400063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.400212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.400269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.400312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.401317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.402576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.402983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.404184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.405176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.405450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.405601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.407000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.408004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.409057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.413072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.414079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.415126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.416924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.417397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.419193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.419584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.421318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.422598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.427128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.428894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.430171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.431471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.431813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.433632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.435089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.436459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.437099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.439513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.440210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.039 [2024-07-15 13:50:29.441963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.443268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.039 [2024-07-15 13:50:29.443578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.444900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.445873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.446851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.447460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.450251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.451776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.452185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.452574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.452908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.454288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.040 [2024-07-15 13:50:29.456084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.040 [2024-07-15 13:50:29.456815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.458063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.459818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.461114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.462906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.463656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.463971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.465820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.466218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.466604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.467560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.299 [2024-07-15 13:50:29.470070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.299 [2024-07-15 13:50:29.471790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.299 (previous message repeated continuously for every allocation attempt from 13:50:29.471790 through 13:50:29.639301) 
00:31:50.302 [2024-07-15 13:50:29.639301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.302 [2024-07-15 13:50:29.639689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.639842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.639888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.639936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.639993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.641984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.642033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.642075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.302 [2024-07-15 13:50:29.642116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.643980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.644022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.644064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.645452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.645515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.645556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.302 [2024-07-15 13:50:29.645602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.645992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.646140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.646186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.646230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.646273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.647496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.647559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.647612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.649309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.649588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.649751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.649806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.302 [2024-07-15 13:50:29.649848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.650246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.651670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.653130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.653178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.654786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.655148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.655306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.656589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.656636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.658091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.659294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.659689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.302 [2024-07-15 13:50:29.659734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.661391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.302 [2024-07-15 13:50:29.661722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.661875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.663366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.663418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.665215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.666437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.668044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.668092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.669383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.669831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.669997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.303 [2024-07-15 13:50:29.670388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.670436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.671943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.673106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.673912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.673966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.675320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.675635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.675785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.677410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.677458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.678287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.679850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.303 [2024-07-15 13:50:29.681541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.681586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.683355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.683628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.683780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.684951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.684998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.686347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.687531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.687934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.687985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.688591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.688864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.303 [2024-07-15 13:50:29.689026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.690421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.690469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.692089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.693283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.694671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.694719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.696333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.696682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.696834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.697233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.697278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.698389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.303 [2024-07-15 13:50:29.699581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.700786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.700833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.702513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.702825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.702983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.704514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.704560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.706338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.707608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.708951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.709000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.710366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.303 [2024-07-15 13:50:29.710637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.710789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.711560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.711609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.713036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.714180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.714580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.303 [2024-07-15 13:50:29.714624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.715011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.715286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.715439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.716804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.716851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.304 [2024-07-15 13:50:29.718205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.719403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.721103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.304 [2024-07-15 13:50:29.721159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.722853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.723135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.723290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.723682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.723727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.724116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.725333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.726958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.727006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.562 [2024-07-15 13:50:29.727646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.727920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.728085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.729621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.729670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.731279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.732523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.733879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.733935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.735321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.735642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.735792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.737420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.562 [2024-07-15 13:50:29.737469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.738295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.739459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.740964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.741013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.741398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.741880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.742040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.743747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.743793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.745388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.746634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.562 [2024-07-15 13:50:29.747980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.562 [2024-07-15 13:50:29.748029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:50.565 (last message repeated 272 more times, 13:50:29.749406 through 13:50:29.949188)
00:31:50.565 [2024-07-15 13:50:29.949683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.949735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.950131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.950179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.952978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.953044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.954398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.954447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.954864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.955367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.955422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.956249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.956301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.565 [2024-07-15 13:50:29.958779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.958839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.960327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.960375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.960762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.961269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.961325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.962668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.962716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.964931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.964991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.965940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.965996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.565 [2024-07-15 13:50:29.966486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.966995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.967055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.968567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.968623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.971203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.971260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.971649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.971698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.972009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.972956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.973011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.974009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.565 [2024-07-15 13:50:29.974058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.976726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.976784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.977178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.977232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.977689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.979315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.979374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.980617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.980668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.982752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.982812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.983211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.565 [2024-07-15 13:50:29.983261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.983683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.985309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.565 [2024-07-15 13:50:29.985371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.987094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.987154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.988828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.988886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.989287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.989336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.989612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.991077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.991132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:29.992384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.992433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.994830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.994887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.996222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.996268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.996705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.997210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.997265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.999063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:29.999107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.001370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.001428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:30.002054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.002106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.002477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.002985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.003047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.004752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.004825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.007666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.008077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.008471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.008520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.008797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.009808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:30.010787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.011743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.011799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.014691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.014768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.014810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.014853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.015206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.016787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.016842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.016886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.016952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.018417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:30.018481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.018523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.018566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.018840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.019004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.019058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.019107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.019148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.020891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.020953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.020996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.021037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.021374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:30.021529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.021582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.021627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.021668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.022896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.022958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.023775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.824 [2024-07-15 13:50:30.025023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.025995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.027518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.027683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.824 [2024-07-15 13:50:30.027796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.027846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.825 [2024-07-15 13:50:30.028165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.028323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.028371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.028415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.028457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.029958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.030072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.030158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.030248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.030809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.031148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.031321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.031416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.825 [2024-07-15 13:50:30.031490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.032699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.032757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.032800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.032841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.033156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.033312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.033359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.033401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.033446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.035050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.035103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:50.825 [2024-07-15 13:50:30.035146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:50.825 [2024-07-15 13:50:30.035975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:51.391
00:31:51.391 Latency(us)
00:31:51.391 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:51.391 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x0 length 0x100
00:31:51.391 crypto_ram : 5.89 43.50 2.72 0.00 0.00 2855560.24 71576.71 2742710.09
00:31:51.391 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x100 length 0x100
00:31:51.391 crypto_ram : 5.87 43.58 2.72 0.00 0.00 2848857.04 73400.32 2713532.33
00:31:51.391 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x0 length 0x100
00:31:51.391 crypto_ram2 : 5.89 43.49 2.72 0.00 0.00 2749523.70 71120.81 2742710.09
00:31:51.391 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x100 length 0x100
00:31:51.391 crypto_ram2 : 5.88 43.57 2.72 0.00 0.00 2742765.30 72944.42 2640587.91
00:31:51.391 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x0 length 0x100
00:31:51.391 crypto_ram3 : 5.61 261.82 16.36 0.00 0.00 434524.50 47869.77 616380.33
00:31:51.391 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x100 length 0x100
00:31:51.391 crypto_ram3 : 5.64 270.35 16.90 0.00 0.00 420944.54 65649.98 612733.11
00:31:51.391 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x0 length 0x100
00:31:51.391 crypto_ram4 : 5.70 276.27 17.27 0.00 0.00 400268.43 13050.21 506963.70
00:31:51.391 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:51.391 Verification LBA range: start 0x100 length 0x100
00:31:51.391 crypto_ram4 : 5.71 286.46 17.90 0.00 0.00 385928.38 19489.84 528847.03
00:31:51.391 ===================================================================================================================
00:31:51.391 Total : 1269.04 79.32 0.00 0.00 748107.17 13050.21 2742710.09
00:31:51.954
00:31:51.954 real 0m9.134s
00:31:51.954 user 0m17.350s
00:31:51.954 sys 0m0.419s
00:31:51.954 13:50:31 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:51.954 13:50:31 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:31:51.954 ************************************
00:31:51.954 END TEST bdev_verify_big_io
00:31:51.954 ************************************
00:31:51.954 13:50:31 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:51.954 13:50:31 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:51.954 13:50:31 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:31:51.954 13:50:31 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:51.954 13:50:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:51.954 ************************************
00:31:51.954 START TEST bdev_write_zeroes
00:31:51.954 ************************************
00:31:51.954 13:50:31 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:51.954 [2024-07-15 13:50:31.274961] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:31:51.954 [2024-07-15 13:50:31.275027] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2256924 ]
00:31:52.211 [2024-07-15 13:50:31.405498] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:52.211 [2024-07-15 13:50:31.508696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:52.211 [2024-07-15 13:50:31.529974] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:31:52.211 [2024-07-15 13:50:31.537995] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:31:52.211 [2024-07-15 13:50:31.546014] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:31:52.468 [2024-07-15 13:50:31.659396] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:31:54.993 [2024-07-15 13:50:33.878248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:31:54.993 [2024-07-15 13:50:33.878319] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:54.993 [2024-07-15 13:50:33.878334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:54.993 [2024-07-15 13:50:33.886267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:31:54.993 [2024-07-15 13:50:33.886286] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:54.993 [2024-07-15 13:50:33.886298] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:54.993 [2024-07-15 13:50:33.894287] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:31:54.993 [2024-07-15 13:50:33.894307] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:54.993 [2024-07-15 13:50:33.894319] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:54.993 [2024-07-15 13:50:33.902307] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:31:54.993 [2024-07-15 13:50:33.902325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:54.993 [2024-07-15 13:50:33.902336] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:54.993 Running I/O for 1 seconds...
00:31:55.923
00:31:55.923 Latency(us)
00:31:55.923 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:55.923 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:55.923 crypto_ram : 1.02 1981.73 7.74 0.00 0.00 64178.58 5499.33 77047.54
00:31:55.923 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:55.923 crypto_ram2 : 1.03 1987.45 7.76 0.00 0.00 63633.90 5442.34 72032.61
00:31:55.923 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:55.923 crypto_ram3 : 1.02 15186.99 59.32 0.00 0.00 8296.08 2478.97 10770.70
00:31:55.923 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:55.923 crypto_ram4 : 1.02 15224.15 59.47 0.00 0.00 8250.50 2464.72 8662.15
00:31:55.923 ===================================================================================================================
00:31:55.923 Total : 34380.33 134.30 0.00 0.00 14724.81 2464.72 77047.54
00:31:56.180
00:31:56.180 real 0m4.224s
00:31:56.180 user 0m3.797s
00:31:56.180 sys 0m0.378s
00:31:56.180 13:50:35 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:56.180
13:50:35 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:31:56.180 ************************************
00:31:56.180 END TEST bdev_write_zeroes
00:31:56.180 ************************************
00:31:56.180 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:56.180 13:50:35 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:56.180 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:31:56.180 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:56.180 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:56.180 ************************************
00:31:56.180 START TEST bdev_json_nonenclosed
00:31:56.180 ************************************
00:31:56.180 13:50:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:56.180 [2024-07-15 13:50:35.539302] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:31:56.180 [2024-07-15 13:50:35.539342] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257535 ]
00:31:56.437 [2024-07-15 13:50:35.648390] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:56.437 [2024-07-15 13:50:35.746289] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:56.437 [2024-07-15 13:50:35.746361] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:31:56.437 [2024-07-15 13:50:35.746381] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:31:56.437 [2024-07-15 13:50:35.746394] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:31:56.437
00:31:56.437 real 0m0.345s
00:31:56.437 user 0m0.217s
00:31:56.437 sys 0m0.126s
00:31:56.437 13:50:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:31:56.437 13:50:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:56.437 13:50:35 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:31:56.437 ************************************
00:31:56.437 END TEST bdev_json_nonenclosed
00:31:56.437 ************************************
00:31:56.694 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:31:56.694 13:50:35 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true
00:31:56.694 13:50:35 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:56.694 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:31:56.694 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:56.694 13:50:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:56.694 ************************************
00:31:56.694 START TEST bdev_json_nonarray
00:31:56.694 ************************************
00:31:56.694 13:50:35 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:56.951 [2024-07-15 13:50:36.040214] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:31:56.951 [2024-07-15 13:50:36.040343] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257628 ]
00:31:56.951 [2024-07-15 13:50:36.236547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:56.951 [2024-07-15 13:50:36.341033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:56.951 [2024-07-15 13:50:36.341114] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:31:56.951 [2024-07-15 13:50:36.341135] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:56.951 [2024-07-15 13:50:36.341147] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:57.208 00:31:57.208 real 0m0.512s 00:31:57.208 user 0m0.282s 00:31:57.208 sys 0m0.226s 00:31:57.208 13:50:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:31:57.208 13:50:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:57.208 13:50:36 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:57.208 ************************************ 00:31:57.208 END TEST bdev_json_nonarray 00:31:57.208 ************************************ 00:31:57.208 13:50:36 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:31:57.208 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:31:57.209 13:50:36 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:31:57.209 13:50:36 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:31:57.209 00:31:57.209 real 1m12.439s 00:31:57.209 user 2m40.327s 00:31:57.209 sys 0m9.172s 00:31:57.209 13:50:36 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:57.209 13:50:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:57.209 ************************************ 00:31:57.209 END TEST blockdev_crypto_aesni 00:31:57.209 ************************************ 00:31:57.209 13:50:36 -- common/autotest_common.sh@1142 -- # return 0 00:31:57.209 13:50:36 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:57.209 13:50:36 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:57.209 13:50:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:57.209 13:50:36 -- common/autotest_common.sh@10 -- # set +x 00:31:57.209 ************************************ 00:31:57.209 START TEST blockdev_crypto_sw 00:31:57.209 ************************************ 00:31:57.209 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:57.465 * Looking for test storage... 
00:31:57.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:57.465 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:57.466 
13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2257708 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2257708 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2257708 ']' 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:57.466 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:57.466 13:50:36 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:57.466 13:50:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:57.466 [2024-07-15 13:50:36.755083] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:31:57.466 [2024-07-15 13:50:36.755154] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2257708 ] 00:31:57.466 [2024-07-15 13:50:36.883285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:57.723 [2024-07-15 13:50:36.989487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:58.652 13:50:37 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:58.652 13:50:37 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:31:58.652 13:50:37 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:58.652 13:50:37 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:31:58.652 13:50:37 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:31:58.652 13:50:37 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.652 13:50:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.909 Malloc0 00:31:58.909 Malloc1 00:31:58.909 true 00:31:58.909 true 00:31:58.909 true 00:31:58.909 [2024-07-15 13:50:38.200404] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:58.909 crypto_ram 00:31:58.909 [2024-07-15 13:50:38.208431] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:58.909 crypto_ram2 00:31:58.909 [2024-07-15 13:50:38.216459] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:58.909 crypto_ram3 00:31:58.909 [ 00:31:58.909 { 00:31:58.909 "name": "Malloc1", 00:31:58.909 "aliases": [ 00:31:58.909 "f93afb0c-2ebc-41a9-afc6-c4e69037ad04" 00:31:58.909 ], 00:31:58.909 "product_name": "Malloc disk", 00:31:58.909 "block_size": 4096, 00:31:58.909 "num_blocks": 4096, 00:31:58.909 "uuid": "f93afb0c-2ebc-41a9-afc6-c4e69037ad04", 
00:31:58.909 "assigned_rate_limits": { 00:31:58.909 "rw_ios_per_sec": 0, 00:31:58.909 "rw_mbytes_per_sec": 0, 00:31:58.909 "r_mbytes_per_sec": 0, 00:31:58.909 "w_mbytes_per_sec": 0 00:31:58.909 }, 00:31:58.910 "claimed": true, 00:31:58.910 "claim_type": "exclusive_write", 00:31:58.910 "zoned": false, 00:31:58.910 "supported_io_types": { 00:31:58.910 "read": true, 00:31:58.910 "write": true, 00:31:58.910 "unmap": true, 00:31:58.910 "flush": true, 00:31:58.910 "reset": true, 00:31:58.910 "nvme_admin": false, 00:31:58.910 "nvme_io": false, 00:31:58.910 "nvme_io_md": false, 00:31:58.910 "write_zeroes": true, 00:31:58.910 "zcopy": true, 00:31:58.910 "get_zone_info": false, 00:31:58.910 "zone_management": false, 00:31:58.910 "zone_append": false, 00:31:58.910 "compare": false, 00:31:58.910 "compare_and_write": false, 00:31:58.910 "abort": true, 00:31:58.910 "seek_hole": false, 00:31:58.910 "seek_data": false, 00:31:58.910 "copy": true, 00:31:58.910 "nvme_iov_md": false 00:31:58.910 }, 00:31:58.910 "memory_domains": [ 00:31:58.910 { 00:31:58.910 "dma_device_id": "system", 00:31:58.910 "dma_device_type": 1 00:31:58.910 }, 00:31:58.910 { 00:31:58.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:58.910 "dma_device_type": 2 00:31:58.910 } 00:31:58.910 ], 00:31:58.910 "driver_specific": {} 00:31:58.910 } 00:31:58.910 ] 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:58.910 13:50:38 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:58.910 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:58.910 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3d52bc2-f32d-58b3-add2-7a7453bfb01c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "c3d52bc2-f32d-58b3-add2-7a7453bfb01c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "96dbb986-83a3-5568-b1d2-9d788575a9a4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "96dbb986-83a3-5568-b1d2-9d788575a9a4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:59.167 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2257708 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2257708 ']' 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2257708 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2257708 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2257708' 00:31:59.167 killing process with pid 2257708 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2257708 00:31:59.167 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2257708 00:31:59.730 13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:59.730 
13:50:38 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:59.730 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:59.730 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:59.730 13:50:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:59.730 ************************************ 00:31:59.730 START TEST bdev_hello_world 00:31:59.730 ************************************ 00:31:59.730 13:50:38 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:59.730 [2024-07-15 13:50:38.972953] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:31:59.730 [2024-07-15 13:50:38.973013] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258064 ] 00:31:59.730 [2024-07-15 13:50:39.099469] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:59.987 [2024-07-15 13:50:39.200730] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:59.987 [2024-07-15 13:50:39.371839] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:59.987 [2024-07-15 13:50:39.371908] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:59.987 [2024-07-15 13:50:39.371924] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.987 [2024-07-15 13:50:39.379859] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:59.987 [2024-07-15 13:50:39.379878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:59.987 [2024-07-15 13:50:39.379890] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.987 [2024-07-15 13:50:39.387879] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:59.987 [2024-07-15 13:50:39.387897] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:59.987 [2024-07-15 13:50:39.387909] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.244 [2024-07-15 13:50:39.428304] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:00.244 [2024-07-15 13:50:39.428338] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:00.244 [2024-07-15 13:50:39.428357] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:32:00.244 [2024-07-15 13:50:39.430343] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:00.244 [2024-07-15 13:50:39.430410] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:00.244 [2024-07-15 13:50:39.430425] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:00.244 [2024-07-15 13:50:39.430459] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:32:00.244 00:32:00.244 [2024-07-15 13:50:39.430476] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:00.244 00:32:00.244 real 0m0.720s 00:32:00.244 user 0m0.484s 00:32:00.244 sys 0m0.221s 00:32:00.244 13:50:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:00.244 13:50:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:00.244 ************************************ 00:32:00.244 END TEST bdev_hello_world 00:32:00.244 ************************************ 00:32:00.536 13:50:39 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:00.536 13:50:39 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:00.536 13:50:39 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:00.536 13:50:39 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:00.536 13:50:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:00.536 ************************************ 00:32:00.536 START TEST bdev_bounds 00:32:00.536 ************************************ 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2258196 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2258196' 00:32:00.536 Process bdevio pid: 2258196 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2258196 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2258196 ']' 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:00.536 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:00.536 13:50:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:00.536 [2024-07-15 13:50:39.771069] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:00.536 [2024-07-15 13:50:39.771134] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2258196 ] 00:32:00.536 [2024-07-15 13:50:39.889037] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:00.799 [2024-07-15 13:50:39.996317] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:00.799 [2024-07-15 13:50:39.996401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:00.799 [2024-07-15 13:50:39.996405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:00.799 [2024-07-15 13:50:40.161162] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:00.799 [2024-07-15 13:50:40.161225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:00.799 [2024-07-15 13:50:40.161240] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.799 [2024-07-15 13:50:40.169186] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:00.799 [2024-07-15 13:50:40.169206] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:00.799 [2024-07-15 13:50:40.169218] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.799 [2024-07-15 13:50:40.177211] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:00.799 [2024-07-15 13:50:40.177229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:00.799 [2024-07-15 13:50:40.177240] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:01.363 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:32:01.363 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:01.363 13:50:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:01.621 I/O targets: 00:32:01.621 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:32:01.621 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:32:01.621 00:32:01.621 00:32:01.621 CUnit - A unit testing framework for C - Version 2.1-3 00:32:01.621 http://cunit.sourceforge.net/ 00:32:01.621 00:32:01.621 00:32:01.621 Suite: bdevio tests on: crypto_ram3 00:32:01.621 Test: blockdev write read block ...passed 00:32:01.621 Test: blockdev write zeroes read block ...passed 00:32:01.621 Test: blockdev write zeroes read no split ...passed 00:32:01.621 Test: blockdev write zeroes read split ...passed 00:32:01.621 Test: blockdev write zeroes read split partial ...passed 00:32:01.621 Test: blockdev reset ...passed 00:32:01.621 Test: blockdev write read 8 blocks ...passed 00:32:01.621 Test: blockdev write read size > 128k ...passed 00:32:01.621 Test: blockdev write read invalid size ...passed 00:32:01.621 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:01.621 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:01.621 Test: blockdev write read max offset ...passed 00:32:01.621 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:01.621 Test: blockdev writev readv 8 blocks ...passed 00:32:01.621 Test: blockdev writev readv 30 x 1block ...passed 00:32:01.621 Test: blockdev writev readv block ...passed 00:32:01.621 Test: blockdev writev readv size > 128k ...passed 00:32:01.621 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:01.621 Test: blockdev comparev and writev ...passed 00:32:01.621 Test: blockdev nvme passthru rw ...passed 00:32:01.621 Test: blockdev nvme passthru vendor specific 
...passed 00:32:01.621 Test: blockdev nvme admin passthru ...passed 00:32:01.621 Test: blockdev copy ...passed 00:32:01.621 Suite: bdevio tests on: crypto_ram 00:32:01.621 Test: blockdev write read block ...passed 00:32:01.621 Test: blockdev write zeroes read block ...passed 00:32:01.621 Test: blockdev write zeroes read no split ...passed 00:32:01.621 Test: blockdev write zeroes read split ...passed 00:32:01.621 Test: blockdev write zeroes read split partial ...passed 00:32:01.621 Test: blockdev reset ...passed 00:32:01.621 Test: blockdev write read 8 blocks ...passed 00:32:01.621 Test: blockdev write read size > 128k ...passed 00:32:01.621 Test: blockdev write read invalid size ...passed 00:32:01.621 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:01.621 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:01.621 Test: blockdev write read max offset ...passed 00:32:01.621 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:01.621 Test: blockdev writev readv 8 blocks ...passed 00:32:01.621 Test: blockdev writev readv 30 x 1block ...passed 00:32:01.621 Test: blockdev writev readv block ...passed 00:32:01.621 Test: blockdev writev readv size > 128k ...passed 00:32:01.621 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:01.621 Test: blockdev comparev and writev ...passed 00:32:01.621 Test: blockdev nvme passthru rw ...passed 00:32:01.621 Test: blockdev nvme passthru vendor specific ...passed 00:32:01.621 Test: blockdev nvme admin passthru ...passed 00:32:01.621 Test: blockdev copy ...passed 00:32:01.621 00:32:01.621 Run Summary: Type Total Ran Passed Failed Inactive 00:32:01.621 suites 2 2 n/a 0 0 00:32:01.621 tests 46 46 46 0 0 00:32:01.621 asserts 260 260 260 0 n/a 00:32:01.621 00:32:01.621 Elapsed time = 0.082 seconds 00:32:01.621 0 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2258196 00:32:01.621 13:50:40 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2258196 ']' 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2258196 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2258196 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:01.621 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2258196' 00:32:01.621 killing process with pid 2258196 00:32:01.622 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2258196 00:32:01.622 13:50:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2258196 00:32:01.879 13:50:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:01.879 00:32:01.879 real 0m1.447s 00:32:01.879 user 0m3.779s 00:32:01.879 sys 0m0.390s 00:32:01.879 13:50:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:01.879 13:50:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:01.879 ************************************ 00:32:01.879 END TEST bdev_bounds 00:32:01.879 ************************************ 00:32:01.879 13:50:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:01.880 13:50:41 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:01.880 13:50:41 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:01.880 13:50:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:01.880 13:50:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:01.880 ************************************ 00:32:01.880 START TEST bdev_nbd 00:32:01.880 ************************************ 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2258463 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2258463 /var/tmp/spdk-nbd.sock 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2258463 ']' 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:01.880 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:01.880 13:50:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:01.880 [2024-07-15 13:50:41.301409] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:01.880 [2024-07-15 13:50:41.301474] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:02.138 [2024-07-15 13:50:41.429193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:02.138 [2024-07-15 13:50:41.531937] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:02.395 [2024-07-15 13:50:41.700807] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:02.395 [2024-07-15 13:50:41.700877] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:02.395 [2024-07-15 13:50:41.700892] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.395 [2024-07-15 13:50:41.708825] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:02.395 [2024-07-15 13:50:41.708844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:02.395 [2024-07-15 13:50:41.708855] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.395 [2024-07-15 13:50:41.716848] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:02.395 [2024-07-15 13:50:41.716870] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:02.395 [2024-07-15 13:50:41.716882] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:02.962 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:03.220 1+0 records in 00:32:03.220 1+0 records out 00:32:03.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269181 s, 15.2 MB/s 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:03.220 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:03.478 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:03.479 1+0 records in 00:32:03.479 1+0 records out 00:32:03.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345431 s, 11.9 MB/s 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 
0 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:03.479 13:50:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:03.736 { 00:32:03.736 "nbd_device": "/dev/nbd0", 00:32:03.736 "bdev_name": "crypto_ram" 00:32:03.736 }, 00:32:03.736 { 00:32:03.736 "nbd_device": "/dev/nbd1", 00:32:03.736 "bdev_name": "crypto_ram3" 00:32:03.736 } 00:32:03.736 ]' 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:03.736 { 00:32:03.736 "nbd_device": "/dev/nbd0", 00:32:03.736 "bdev_name": "crypto_ram" 00:32:03.736 }, 00:32:03.736 { 00:32:03.736 "nbd_device": "/dev/nbd1", 00:32:03.736 "bdev_name": "crypto_ram3" 00:32:03.736 } 00:32:03.736 ]' 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:03.736 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:03.737 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:03.737 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:03.737 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:03.737 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:03.737 13:50:43 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:04.004 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:04.270 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:04.527 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:04.527 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:04.527 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:04.527 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:04.528 13:50:43 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:04.528 13:50:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:05.093 /dev/nbd0 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:05.093 1+0 records in 00:32:05.093 1+0 records out 00:32:05.093 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000158923 s, 25.8 MB/s 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:05.093 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:32:05.351 /dev/nbd1 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:05.351 1+0 records in 00:32:05.351 1+0 records out 00:32:05.351 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345937 s, 11.8 MB/s 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:05.351 13:50:44 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:05.351 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:05.609 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:05.609 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:05.609 { 00:32:05.609 "nbd_device": "/dev/nbd0", 00:32:05.609 "bdev_name": "crypto_ram" 00:32:05.609 }, 00:32:05.609 { 00:32:05.609 "nbd_device": "/dev/nbd1", 00:32:05.609 "bdev_name": "crypto_ram3" 00:32:05.609 } 00:32:05.609 ]' 00:32:05.609 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:05.609 { 00:32:05.609 "nbd_device": "/dev/nbd0", 00:32:05.609 "bdev_name": "crypto_ram" 00:32:05.609 }, 00:32:05.609 { 00:32:05.609 "nbd_device": "/dev/nbd1", 00:32:05.609 "bdev_name": "crypto_ram3" 00:32:05.609 } 00:32:05.609 ]' 00:32:05.609 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:05.609 13:50:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:05.609 /dev/nbd1' 00:32:05.609 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:05.610 /dev/nbd1' 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:32:05.610 13:50:45 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:05.610 256+0 records in 00:32:05.610 256+0 records out 00:32:05.610 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011028 s, 95.1 MB/s 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:05.610 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:05.868 256+0 records in 00:32:05.868 256+0 records out 00:32:05.868 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196301 s, 53.4 MB/s 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:05.868 256+0 records in 00:32:05.868 256+0 records out 00:32:05.868 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0351483 s, 29.8 MB/s 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:05.868 
13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:05.868 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:06.126 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:06.384 13:50:45 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:06.384 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:06.642 13:50:45 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:06.642 13:50:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:06.900 malloc_lvol_verify 00:32:06.901 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:07.164 7abf4fc8-4cfb-4fa8-87ce-8559875bd8fa 00:32:07.164 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:07.423 81528e3a-7f28-4881-b599-37c25462fba6 00:32:07.423 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:07.681 /dev/nbd0 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:07.681 mke2fs 1.46.5 (30-Dec-2021) 00:32:07.681 Discarding device blocks: 0/4096 done 00:32:07.681 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:07.681 00:32:07.681 Allocating group tables: 0/1 done 00:32:07.681 Writing inode tables: 0/1 done 00:32:07.681 Creating journal (1024 blocks): done 00:32:07.681 Writing superblocks and filesystem accounting information: 0/1 done 00:32:07.681 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:07.681 13:50:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2258463 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2258463 ']' 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill 
-0 2258463 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2258463 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2258463' 00:32:07.939 killing process with pid 2258463 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2258463 00:32:07.939 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2258463 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:08.197 00:32:08.197 real 0m6.249s 00:32:08.197 user 0m9.161s 00:32:08.197 sys 0m2.388s 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:08.197 ************************************ 00:32:08.197 END TEST bdev_nbd 00:32:08.197 ************************************ 00:32:08.197 13:50:47 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:08.197 13:50:47 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:08.197 13:50:47 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:32:08.197 13:50:47 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:32:08.197 13:50:47 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:08.197 13:50:47 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:08.197 13:50:47 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:08.197 13:50:47 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:08.197 ************************************ 00:32:08.197 START TEST bdev_fio 00:32:08.197 ************************************ 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:08.197 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 
00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:08.197 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:08.456 ************************************ 00:32:08.456 START TEST bdev_fio_rw_verify 00:32:08.456 ************************************ 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:08.456 13:50:47 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:08.456 13:50:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.714 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:08.714 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:08.714 fio-3.35 00:32:08.714 Starting 2 threads 00:32:20.906 00:32:20.906 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2259572: Mon Jul 15 13:50:58 2024 00:32:20.906 read: IOPS=21.8k, BW=85.3MiB/s (89.4MB/s)(853MiB/10001msec) 00:32:20.906 slat (nsec): min=14150, max=70409, avg=19812.09, stdev=3466.00 00:32:20.906 clat (usec): min=7, max=461, avg=145.11, stdev=57.69 00:32:20.906 lat (usec): min=25, max=478, avg=164.92, stdev=59.07 
00:32:20.906 clat percentiles (usec): 00:32:20.906 | 50.000th=[ 143], 99.000th=[ 277], 99.900th=[ 297], 99.990th=[ 334], 00:32:20.906 | 99.999th=[ 412] 00:32:20.906 write: IOPS=26.3k, BW=103MiB/s (108MB/s)(973MiB/9484msec); 0 zone resets 00:32:20.906 slat (usec): min=14, max=399, avg=33.71, stdev= 4.33 00:32:20.906 clat (usec): min=15, max=1960, avg=195.69, stdev=89.64 00:32:20.906 lat (usec): min=48, max=1994, avg=229.40, stdev=91.25 00:32:20.906 clat percentiles (usec): 00:32:20.906 | 50.000th=[ 190], 99.000th=[ 388], 99.900th=[ 408], 99.990th=[ 644], 00:32:20.906 | 99.999th=[ 1893] 00:32:20.906 bw ( KiB/s): min=95512, max=105720, per=94.91%, avg=99713.26, stdev=1573.61, samples=38 00:32:20.906 iops : min=23878, max=26430, avg=24928.32, stdev=393.40, samples=38 00:32:20.906 lat (usec) : 10=0.01%, 20=0.01%, 50=4.61%, 100=14.90%, 250=63.34% 00:32:20.906 lat (usec) : 500=17.13%, 750=0.02%, 1000=0.01% 00:32:20.906 lat (msec) : 2=0.01% 00:32:20.906 cpu : usr=99.59%, sys=0.01%, ctx=21, majf=0, minf=458 00:32:20.906 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:20.906 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:20.906 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:20.906 issued rwts: total=218341,249092,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:20.906 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:20.906 00:32:20.906 Run status group 0 (all jobs): 00:32:20.906 READ: bw=85.3MiB/s (89.4MB/s), 85.3MiB/s-85.3MiB/s (89.4MB/s-89.4MB/s), io=853MiB (894MB), run=10001-10001msec 00:32:20.906 WRITE: bw=103MiB/s (108MB/s), 103MiB/s-103MiB/s (108MB/s-108MB/s), io=973MiB (1020MB), run=9484-9484msec 00:32:20.906 00:32:20.906 real 0m11.121s 00:32:20.906 user 0m23.738s 00:32:20.906 sys 0m0.352s 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:20.906 13:50:58 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:20.906 ************************************ 00:32:20.906 END TEST bdev_fio_rw_verify 00:32:20.906 ************************************ 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:20.906 
13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:20.906 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3d52bc2-f32d-58b3-add2-7a7453bfb01c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "c3d52bc2-f32d-58b3-add2-7a7453bfb01c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "96dbb986-83a3-5568-b1d2-9d788575a9a4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": 
"96dbb986-83a3-5568-b1d2-9d788575a9a4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:20.907 crypto_ram3 ]] 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c3d52bc2-f32d-58b3-add2-7a7453bfb01c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "c3d52bc2-f32d-58b3-add2-7a7453bfb01c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "96dbb986-83a3-5568-b1d2-9d788575a9a4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "96dbb986-83a3-5568-b1d2-9d788575a9a4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' 
' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:20.907 ************************************ 00:32:20.907 START TEST bdev_fio_trim 00:32:20.907 ************************************ 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep 
libasan 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:20.907 13:50:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:20.907 13:50:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:20.907 13:50:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:20.907 13:50:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:20.907 13:50:59 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:20.907 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:20.907 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:20.907 fio-3.35 00:32:20.907 Starting 2 threads 00:32:30.876 00:32:30.876 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2261087: Mon Jul 15 13:51:09 2024 00:32:30.876 write: IOPS=40.0k, BW=156MiB/s (164MB/s)(1562MiB/10001msec); 0 zone resets 00:32:30.876 slat (usec): min=14, max=493, avg=22.07, stdev= 4.39 00:32:30.876 clat (usec): min=23, max=1814, avg=164.20, stdev=90.52 00:32:30.876 lat (usec): min=40, max=1832, avg=186.27, stdev=93.84 00:32:30.876 clat percentiles (usec): 00:32:30.876 | 50.000th=[ 133], 99.000th=[ 338], 99.900th=[ 363], 99.990th=[ 437], 00:32:30.876 | 99.999th=[ 1729] 00:32:30.876 bw ( KiB/s): min=156904, max=163936, per=100.00%, avg=159952.42, stdev=931.70, samples=38 00:32:30.876 iops : min=39226, max=40984, avg=39988.11, stdev=232.92, samples=38 00:32:30.876 trim: IOPS=40.0k, BW=156MiB/s (164MB/s)(1562MiB/10001msec); 0 zone resets 00:32:30.876 slat (nsec): min=5983, max=54132, avg=9966.59, stdev=2241.52 00:32:30.876 clat (usec): min=41, max=1832, avg=109.62, stdev=32.87 00:32:30.876 lat (usec): min=49, max=1839, avg=119.59, stdev=32.96 00:32:30.876 clat percentiles (usec): 00:32:30.876 | 50.000th=[ 110], 99.000th=[ 178], 99.900th=[ 190], 99.990th=[ 223], 00:32:30.876 | 99.999th=[ 1745] 00:32:30.876 bw ( KiB/s): min=156928, max=163936, per=100.00%, avg=159954.11, stdev=930.85, samples=38 00:32:30.876 iops : min=39232, max=40984, avg=39988.53, stdev=232.71, samples=38 00:32:30.876 lat (usec) : 50=4.11%, 100=33.02%, 250=49.94%, 500=12.92%, 750=0.01% 00:32:30.876 lat (usec) : 1000=0.01% 00:32:30.876 lat (msec) : 2=0.01% 00:32:30.876 cpu : usr=99.55%, sys=0.01%, ctx=31, majf=0, minf=345 00:32:30.876 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:30.876 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:30.876 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:30.876 issued rwts: total=0,399856,399857,0 
short=0,0,0,0 dropped=0,0,0,0 00:32:30.876 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:30.876 00:32:30.876 Run status group 0 (all jobs): 00:32:30.876 WRITE: bw=156MiB/s (164MB/s), 156MiB/s-156MiB/s (164MB/s-164MB/s), io=1562MiB (1638MB), run=10001-10001msec 00:32:30.876 TRIM: bw=156MiB/s (164MB/s), 156MiB/s-156MiB/s (164MB/s-164MB/s), io=1562MiB (1638MB), run=10001-10001msec 00:32:30.876 00:32:30.876 real 0m11.075s 00:32:30.876 user 0m23.584s 00:32:30.876 sys 0m0.325s 00:32:30.876 13:51:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:30.876 13:51:10 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:30.876 ************************************ 00:32:30.876 END TEST bdev_fio_trim 00:32:30.877 ************************************ 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:30.877 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:30.877 00:32:30.877 real 0m22.522s 00:32:30.877 user 0m47.501s 00:32:30.877 sys 0m0.845s 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:30.877 ************************************ 00:32:30.877 END TEST bdev_fio 00:32:30.877 ************************************ 00:32:30.877 13:51:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:30.877 13:51:10 blockdev_crypto_sw -- 
bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:30.877 13:51:10 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:30.877 13:51:10 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:30.877 13:51:10 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:30.877 13:51:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:30.877 ************************************ 00:32:30.877 START TEST bdev_verify 00:32:30.877 ************************************ 00:32:30.877 13:51:10 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:30.877 [2024-07-15 13:51:10.223874] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:30.877 [2024-07-15 13:51:10.223944] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2262895 ] 00:32:31.191 [2024-07-15 13:51:10.352152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:31.191 [2024-07-15 13:51:10.451367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:31.191 [2024-07-15 13:51:10.451372] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:31.448 [2024-07-15 13:51:10.639513] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:31.448 [2024-07-15 13:51:10.639578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:31.448 [2024-07-15 13:51:10.639593] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:31.448 [2024-07-15 13:51:10.647533] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:31.448 [2024-07-15 13:51:10.647553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:31.448 [2024-07-15 13:51:10.647565] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:31.448 [2024-07-15 13:51:10.655559] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:31.448 [2024-07-15 13:51:10.655578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:31.448 [2024-07-15 13:51:10.655589] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:31.448 Running I/O for 5 seconds... 
00:32:36.707 00:32:36.707 Latency(us) 00:32:36.707 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:36.707 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:36.707 Verification LBA range: start 0x0 length 0x800 00:32:36.707 crypto_ram : 5.02 6095.89 23.81 0.00 0.00 20909.70 1695.39 25758.50 00:32:36.707 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:36.707 Verification LBA range: start 0x800 length 0x800 00:32:36.707 crypto_ram : 5.01 6128.38 23.94 0.00 0.00 20801.71 1659.77 25644.52 00:32:36.707 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:36.707 Verification LBA range: start 0x0 length 0x800 00:32:36.707 crypto_ram3 : 5.03 3056.12 11.94 0.00 0.00 41637.08 2108.55 30089.57 00:32:36.707 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:36.707 Verification LBA range: start 0x800 length 0x800 00:32:36.707 crypto_ram3 : 5.03 3080.29 12.03 0.00 0.00 41308.18 1852.10 30545.47 00:32:36.707 =================================================================================================================== 00:32:36.707 Total : 18360.67 71.72 0.00 0.00 27756.34 1659.77 30545.47 00:32:36.707 00:32:36.707 real 0m5.805s 00:32:36.707 user 0m10.901s 00:32:36.707 sys 0m0.243s 00:32:36.707 13:51:15 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:36.707 13:51:15 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:36.707 ************************************ 00:32:36.707 END TEST bdev_verify 00:32:36.707 ************************************ 00:32:36.707 13:51:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:36.707 13:51:16 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:36.707 13:51:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:36.707 13:51:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:36.707 13:51:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:36.707 ************************************ 00:32:36.707 START TEST bdev_verify_big_io 00:32:36.707 ************************************ 00:32:36.707 13:51:16 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:36.707 [2024-07-15 13:51:16.091746] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:32:36.708 [2024-07-15 13:51:16.091804] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2263730 ] 00:32:36.965 [2024-07-15 13:51:16.220652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:36.965 [2024-07-15 13:51:16.318348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:36.965 [2024-07-15 13:51:16.318354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:37.222 [2024-07-15 13:51:16.499293] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:37.222 [2024-07-15 13:51:16.499368] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:37.222 [2024-07-15 13:51:16.499384] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.222 [2024-07-15 13:51:16.507315] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:37.222 [2024-07-15 13:51:16.507335] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:37.222 [2024-07-15 13:51:16.507347] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.222 [2024-07-15 13:51:16.515337] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:37.222 [2024-07-15 13:51:16.515356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:37.222 [2024-07-15 13:51:16.515367] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.222 Running I/O for 5 seconds... 00:32:42.506 00:32:42.506 Latency(us) 00:32:42.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:42.506 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:42.506 Verification LBA range: start 0x0 length 0x80 00:32:42.506 crypto_ram : 5.31 458.42 28.65 0.00 0.00 272820.80 6753.06 379310.97 00:32:42.506 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:42.506 Verification LBA range: start 0x80 length 0x80 00:32:42.506 crypto_ram : 5.13 474.22 29.64 0.00 0.00 263765.29 6183.18 373840.14 00:32:42.506 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:42.506 Verification LBA range: start 0x0 length 0x80 00:32:42.506 crypto_ram3 : 5.32 240.60 15.04 0.00 0.00 500461.60 6012.22 379310.97 00:32:42.506 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:42.506 Verification LBA range: start 0x80 length 0x80 00:32:42.506 crypto_ram3 : 5.32 264.80 16.55 0.00 0.00 454942.09 6183.18 375663.75 00:32:42.506 =================================================================================================================== 00:32:42.506 Total : 
1438.04 89.88 0.00 0.00 342442.62 6012.22 379310.97 00:32:42.764 00:32:42.764 real 0m6.089s 00:32:42.764 user 0m11.486s 00:32:42.764 sys 0m0.241s 00:32:42.764 13:51:22 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:42.764 13:51:22 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:32:42.764 ************************************ 00:32:42.764 END TEST bdev_verify_big_io 00:32:42.764 ************************************ 00:32:42.764 13:51:22 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:42.764 13:51:22 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:42.764 13:51:22 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:42.764 13:51:22 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:42.764 13:51:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:43.022 ************************************ 00:32:43.022 START TEST bdev_write_zeroes 00:32:43.022 ************************************ 00:32:43.022 13:51:22 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:43.022 [2024-07-15 13:51:22.275498] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:43.022 [2024-07-15 13:51:22.275558] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264452 ] 00:32:43.022 [2024-07-15 13:51:22.404220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:43.281 [2024-07-15 13:51:22.505609] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:43.281 [2024-07-15 13:51:22.687434] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:43.281 [2024-07-15 13:51:22.687500] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:43.281 [2024-07-15 13:51:22.687516] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:43.281 [2024-07-15 13:51:22.695451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:43.281 [2024-07-15 13:51:22.695472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:43.281 [2024-07-15 13:51:22.695484] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:43.281 [2024-07-15 13:51:22.703472] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:43.281 [2024-07-15 13:51:22.703492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:43.281 [2024-07-15 13:51:22.703503] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:43.539 Running I/O for 1 seconds... 
00:32:44.473 00:32:44.473 Latency(us) 00:32:44.473 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:44.473 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:44.473 crypto_ram : 1.01 26669.95 104.18 0.00 0.00 4787.46 1289.35 6439.62 00:32:44.473 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:44.473 crypto_ram3 : 1.01 13307.98 51.98 0.00 0.00 9554.34 5955.23 9687.93 00:32:44.473 =================================================================================================================== 00:32:44.473 Total : 39977.93 156.16 0.00 0.00 6376.42 1289.35 9687.93 00:32:44.732 00:32:44.732 real 0m1.787s 00:32:44.732 user 0m1.520s 00:32:44.732 sys 0m0.242s 00:32:44.732 13:51:24 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:44.732 13:51:24 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:32:44.732 ************************************ 00:32:44.732 END TEST bdev_write_zeroes 00:32:44.732 ************************************ 00:32:44.732 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:44.732 13:51:24 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:44.732 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:44.732 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:44.732 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:44.732 ************************************ 00:32:44.732 START TEST bdev_json_nonenclosed 00:32:44.732 ************************************ 00:32:44.732 13:51:24 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:44.732 [2024-07-15 13:51:24.135238] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:32:44.732 [2024-07-15 13:51:24.135310] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264684 ] 00:32:44.995 [2024-07-15 13:51:24.263004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:44.995 [2024-07-15 13:51:24.372181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:44.995 [2024-07-15 13:51:24.372255] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:32:44.995 [2024-07-15 13:51:24.372276] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:44.995 [2024-07-15 13:51:24.372289] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:45.255 00:32:45.255 real 0m0.408s 00:32:45.255 user 0m0.254s 00:32:45.255 sys 0m0.151s 00:32:45.255 13:51:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:32:45.255 13:51:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.255 13:51:24 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:45.255 ************************************ 00:32:45.255 END TEST bdev_json_nonenclosed 00:32:45.255 ************************************ 00:32:45.255 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:32:45.255 13:51:24 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:32:45.255 13:51:24 
blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:45.255 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:45.255 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:45.255 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:45.255 ************************************ 00:32:45.255 START TEST bdev_json_nonarray 00:32:45.255 ************************************ 00:32:45.255 13:51:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:45.255 [2024-07-15 13:51:24.624028] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:32:45.255 [2024-07-15 13:51:24.624089] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264835 ] 00:32:45.513 [2024-07-15 13:51:24.753414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:45.513 [2024-07-15 13:51:24.853777] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:45.513 [2024-07-15 13:51:24.853856] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:32:45.513 [2024-07-15 13:51:24.853877] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:45.513 [2024-07-15 13:51:24.853889] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:45.771 00:32:45.771 real 0m0.390s 00:32:45.771 user 0m0.233s 00:32:45.771 sys 0m0.155s 00:32:45.771 13:51:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:32:45.771 13:51:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.771 13:51:24 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:45.771 ************************************ 00:32:45.771 END TEST bdev_json_nonarray 00:32:45.771 ************************************ 00:32:45.771 13:51:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:32:45.771 13:51:24 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:32:45.771 13:51:25 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:32:45.771 13:51:25 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:32:45.771 13:51:25 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:32:45.771 13:51:25 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:32:45.771 13:51:25 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:32:45.771 13:51:25 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:45.771 13:51:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:45.771 ************************************ 00:32:45.771 START TEST bdev_crypto_enomem 00:32:45.771 ************************************ 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2264862 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2264862 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2264862 ']' 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:45.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:45.771 13:51:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:45.771 [2024-07-15 13:51:25.101136] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:45.771 [2024-07-15 13:51:25.101204] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2264862 ] 00:32:46.030 [2024-07-15 13:51:25.219922] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:46.030 [2024-07-15 13:51:25.325113] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:46.962 true 00:32:46.962 base0 00:32:46.962 true 00:32:46.962 [2024-07-15 13:51:26.060080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:46.962 crypt0 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:46.962 [ 00:32:46.962 { 00:32:46.962 "name": "crypt0", 00:32:46.962 "aliases": [ 00:32:46.962 "33dd3543-0fe5-5912-9edf-3eab72daa84a" 00:32:46.962 ], 00:32:46.962 "product_name": "crypto", 00:32:46.962 "block_size": 512, 00:32:46.962 "num_blocks": 2097152, 00:32:46.962 "uuid": "33dd3543-0fe5-5912-9edf-3eab72daa84a", 00:32:46.962 "assigned_rate_limits": { 00:32:46.962 "rw_ios_per_sec": 0, 00:32:46.962 "rw_mbytes_per_sec": 0, 00:32:46.962 "r_mbytes_per_sec": 0, 00:32:46.962 "w_mbytes_per_sec": 0 00:32:46.962 }, 00:32:46.962 "claimed": false, 00:32:46.962 "zoned": false, 00:32:46.962 "supported_io_types": { 00:32:46.962 "read": true, 00:32:46.962 "write": true, 00:32:46.962 "unmap": false, 00:32:46.962 "flush": false, 00:32:46.962 "reset": true, 00:32:46.962 "nvme_admin": false, 00:32:46.962 "nvme_io": false, 00:32:46.962 "nvme_io_md": false, 00:32:46.962 "write_zeroes": true, 00:32:46.962 "zcopy": false, 00:32:46.962 "get_zone_info": false, 00:32:46.962 "zone_management": false, 00:32:46.962 "zone_append": false, 00:32:46.962 "compare": false, 00:32:46.962 "compare_and_write": false, 00:32:46.962 "abort": false, 
00:32:46.962 "seek_hole": false, 00:32:46.962 "seek_data": false, 00:32:46.962 "copy": false, 00:32:46.962 "nvme_iov_md": false 00:32:46.962 }, 00:32:46.962 "memory_domains": [ 00:32:46.962 { 00:32:46.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:46.962 "dma_device_type": 2 00:32:46.962 } 00:32:46.962 ], 00:32:46.962 "driver_specific": { 00:32:46.962 "crypto": { 00:32:46.962 "base_bdev_name": "EE_base0", 00:32:46.962 "name": "crypt0", 00:32:46.962 "key_name": "test_dek_sw" 00:32:46.962 } 00:32:46.962 } 00:32:46.962 } 00:32:46.962 ] 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2265039 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:32:46.962 13:51:26 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:46.962 Running I/O for 5 seconds... 
00:32:47.893 13:51:27 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:32:47.893 13:51:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:47.893 13:51:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:47.893 13:51:27 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:47.893 13:51:27 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2265039 00:32:52.073 00:32:52.073 Latency(us) 00:32:52.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:52.073 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:32:52.073 crypt0 : 5.00 36240.16 141.56 0.00 0.00 879.34 420.29 1745.25 00:32:52.073 =================================================================================================================== 00:32:52.073 Total : 36240.16 141.56 0.00 0.00 879.34 420.29 1745.25 00:32:52.073 0 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2264862 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2264862 ']' 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2264862 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:32:52.073 13:51:31 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2264862 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2264862' 00:32:52.073 killing process with pid 2264862 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2264862 00:32:52.073 Received shutdown signal, test time was about 5.000000 seconds 00:32:52.073 00:32:52.073 Latency(us) 00:32:52.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:52.073 =================================================================================================================== 00:32:52.073 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:52.073 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2264862 00:32:52.332 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:32:52.332 00:32:52.332 real 0m6.469s 00:32:52.332 user 0m6.719s 00:32:52.332 sys 0m0.392s 00:32:52.332 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:52.332 13:51:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:52.332 ************************************ 00:32:52.332 END TEST bdev_crypto_enomem 00:32:52.332 ************************************ 00:32:52.332 13:51:31 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - 
SIGINT SIGTERM EXIT 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:32:52.332 13:51:31 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:32:52.332 00:32:52.332 real 0m54.982s 00:32:52.332 user 1m34.968s 00:32:52.332 sys 0m6.464s 00:32:52.332 13:51:31 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:52.332 13:51:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:52.332 ************************************ 00:32:52.332 END TEST blockdev_crypto_sw 00:32:52.332 ************************************ 00:32:52.332 13:51:31 -- common/autotest_common.sh@1142 -- # return 0 00:32:52.332 13:51:31 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:52.332 13:51:31 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:52.332 13:51:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:52.332 13:51:31 -- common/autotest_common.sh@10 -- # set +x 00:32:52.332 ************************************ 00:32:52.332 START TEST blockdev_crypto_qat 00:32:52.332 ************************************ 00:32:52.332 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:52.332 * Looking for test storage... 
00:32:52.332 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:52.332 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2265798 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:52.590 13:51:31 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2265798 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2265798 ']' 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:52.590 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:52.590 13:51:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:52.590 [2024-07-15 13:51:31.820494] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:32:52.590 [2024-07-15 13:51:31.820576] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2265798 ] 00:32:52.590 [2024-07-15 13:51:31.948597] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:52.848 [2024-07-15 13:51:32.052488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:53.456 13:51:32 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:53.456 13:51:32 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:32:53.456 13:51:32 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:53.456 13:51:32 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:32:53.456 13:51:32 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:32:53.456 13:51:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:53.456 13:51:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:53.456 [2024-07-15 13:51:32.766730] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:53.456 [2024-07-15 13:51:32.774766] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:53.456 [2024-07-15 13:51:32.782783] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:53.456 [2024-07-15 13:51:32.860158] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:55.984 true 00:32:55.984 true 00:32:55.984 true 00:32:55.984 true 00:32:55.984 Malloc0 00:32:55.984 Malloc1 00:32:55.984 Malloc2 00:32:55.984 Malloc3 00:32:55.984 [2024-07-15 13:51:35.245716] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:32:55.984 crypto_ram 00:32:55.984 [2024-07-15 13:51:35.253732] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:55.984 crypto_ram1 00:32:55.984 [2024-07-15 13:51:35.261755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:55.984 crypto_ram2 00:32:55.984 [2024-07-15 13:51:35.269778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:55.984 crypto_ram3 00:32:55.984 [ 00:32:55.984 { 00:32:55.984 "name": "Malloc1", 00:32:55.984 "aliases": [ 00:32:55.984 "e0bc8b8b-687c-4895-8275-6af63e2f81a0" 00:32:55.984 ], 00:32:55.984 "product_name": "Malloc disk", 00:32:55.984 "block_size": 512, 00:32:55.984 "num_blocks": 65536, 00:32:55.984 "uuid": "e0bc8b8b-687c-4895-8275-6af63e2f81a0", 00:32:55.984 "assigned_rate_limits": { 00:32:55.984 "rw_ios_per_sec": 0, 00:32:55.984 "rw_mbytes_per_sec": 0, 00:32:55.984 "r_mbytes_per_sec": 0, 00:32:55.984 "w_mbytes_per_sec": 0 00:32:55.984 }, 00:32:55.984 "claimed": true, 00:32:55.984 "claim_type": "exclusive_write", 00:32:55.984 "zoned": false, 00:32:55.984 "supported_io_types": { 00:32:55.984 "read": true, 00:32:55.984 "write": true, 00:32:55.984 "unmap": true, 00:32:55.984 "flush": true, 00:32:55.984 "reset": true, 00:32:55.984 "nvme_admin": false, 00:32:55.984 "nvme_io": false, 00:32:55.984 "nvme_io_md": false, 00:32:55.984 "write_zeroes": true, 00:32:55.984 "zcopy": true, 00:32:55.984 "get_zone_info": false, 00:32:55.984 "zone_management": false, 00:32:55.984 "zone_append": false, 00:32:55.984 "compare": false, 00:32:55.984 "compare_and_write": false, 00:32:55.984 "abort": true, 00:32:55.984 "seek_hole": false, 00:32:55.984 "seek_data": false, 00:32:55.984 "copy": true, 00:32:55.984 "nvme_iov_md": false 00:32:55.984 }, 00:32:55.984 "memory_domains": [ 00:32:55.984 { 00:32:55.984 "dma_device_id": "system", 00:32:55.984 "dma_device_type": 1 00:32:55.984 }, 00:32:55.984 { 00:32:55.984 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:32:55.984 "dma_device_type": 2 00:32:55.984 } 00:32:55.984 ], 00:32:55.984 "driver_specific": {} 00:32:55.984 } 00:32:55.984 ] 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:55.984 13:51:35 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:55.984 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:55.984 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b382ef69-2c7d-5bc6-9a92-f3a59e67da95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b382ef69-2c7d-5bc6-9a92-f3a59e67da95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "042d19ae-d8e6-52f2-b71a-d086b99e0dc9"' ' ],' ' "product_name": 
"crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "042d19ae-d8e6-52f2-b71a-d086b99e0dc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d9f3d429-8e2c-5aa7-8b37-4e43135081ed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d9f3d429-8e2c-5aa7-8b37-4e43135081ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' 
' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:56.243 
13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:56.243 13:51:35 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2265798 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2265798 ']' 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2265798 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2265798 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2265798' 00:32:56.243 killing process with pid 2265798 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2265798 00:32:56.243 13:51:35 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2265798 00:32:56.809 13:51:36 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:56.809 13:51:36 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:56.809 13:51:36 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:56.809 13:51:36 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:56.809 13:51:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:56.809 ************************************ 00:32:56.809 START TEST bdev_hello_world 00:32:56.809 
************************************ 00:32:56.809 13:51:36 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:56.809 [2024-07-15 13:51:36.189099] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:32:56.809 [2024-07-15 13:51:36.189160] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266365 ] 00:32:57.067 [2024-07-15 13:51:36.317997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:57.067 [2024-07-15 13:51:36.414119] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:57.067 [2024-07-15 13:51:36.435393] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:57.067 [2024-07-15 13:51:36.443422] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:57.067 [2024-07-15 13:51:36.451447] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:57.325 [2024-07-15 13:51:36.565146] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:59.855 [2024-07-15 13:51:38.777446] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:59.855 [2024-07-15 13:51:38.777522] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:59.855 [2024-07-15 13:51:38.777538] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.855 [2024-07-15 13:51:38.785465] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:32:59.855 [2024-07-15 13:51:38.785486] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:59.855 [2024-07-15 13:51:38.785498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.855 [2024-07-15 13:51:38.793486] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:59.855 [2024-07-15 13:51:38.793505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:59.855 [2024-07-15 13:51:38.793516] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.855 [2024-07-15 13:51:38.801521] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:59.855 [2024-07-15 13:51:38.801540] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:59.855 [2024-07-15 13:51:38.801552] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.855 [2024-07-15 13:51:38.878995] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:59.855 [2024-07-15 13:51:38.879040] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:59.855 [2024-07-15 13:51:38.879058] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:59.855 [2024-07-15 13:51:38.880331] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:59.855 [2024-07-15 13:51:38.880409] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:59.855 [2024-07-15 13:51:38.880426] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:59.855 [2024-07-15 13:51:38.880471] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:59.855 00:32:59.855 [2024-07-15 13:51:38.880491] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:00.113 00:33:00.113 real 0m3.164s 00:33:00.113 user 0m2.748s 00:33:00.113 sys 0m0.375s 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:00.113 ************************************ 00:33:00.113 END TEST bdev_hello_world 00:33:00.113 ************************************ 00:33:00.113 13:51:39 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:00.113 13:51:39 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:00.113 13:51:39 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:00.113 13:51:39 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:00.113 13:51:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:00.113 ************************************ 00:33:00.113 START TEST bdev_bounds 00:33:00.113 ************************************ 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2266776 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2266776' 00:33:00.113 Process bdevio pid: 2266776 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 2266776 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2266776 ']' 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:00.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:00.113 13:51:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:00.113 [2024-07-15 13:51:39.433443] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:33:00.113 [2024-07-15 13:51:39.433509] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2266776 ] 00:33:00.371 [2024-07-15 13:51:39.560303] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:00.371 [2024-07-15 13:51:39.679948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:00.371 [2024-07-15 13:51:39.679971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:00.371 [2024-07-15 13:51:39.679977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:00.371 [2024-07-15 13:51:39.701389] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:00.371 [2024-07-15 13:51:39.709420] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:00.371 [2024-07-15 13:51:39.717441] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:00.629 [2024-07-15 13:51:39.823300] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:03.155 [2024-07-15 13:51:42.031470] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:03.155 [2024-07-15 13:51:42.031558] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:03.156 [2024-07-15 13:51:42.031576] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.156 [2024-07-15 13:51:42.039486] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:03.156 [2024-07-15 13:51:42.039512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:03.156 [2024-07-15 13:51:42.039526] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.156 [2024-07-15 13:51:42.047508] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:03.156 [2024-07-15 13:51:42.047526] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:03.156 [2024-07-15 13:51:42.047538] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.156 [2024-07-15 13:51:42.055532] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:03.156 [2024-07-15 13:51:42.055551] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:03.156 [2024-07-15 13:51:42.055562] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:03.156 I/O targets: 00:33:03.156 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:03.156 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:33:03.156 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:33:03.156 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:03.156 00:33:03.156 00:33:03.156 CUnit - A unit testing framework for C - Version 2.1-3 00:33:03.156 http://cunit.sourceforge.net/ 00:33:03.156 00:33:03.156 00:33:03.156 Suite: bdevio tests on: crypto_ram3 00:33:03.156 Test: blockdev write read block ...passed 00:33:03.156 Test: blockdev write zeroes read block ...passed 00:33:03.156 Test: blockdev write zeroes read no split ...passed 00:33:03.156 Test: blockdev write zeroes read split 
...passed 00:33:03.156 Test: blockdev write zeroes read split partial ...passed 00:33:03.156 Test: blockdev reset ...passed 00:33:03.156 Test: blockdev write read 8 blocks ...passed 00:33:03.156 Test: blockdev write read size > 128k ...passed 00:33:03.156 Test: blockdev write read invalid size ...passed 00:33:03.156 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:03.156 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:03.156 Test: blockdev write read max offset ...passed 00:33:03.156 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:03.156 Test: blockdev writev readv 8 blocks ...passed 00:33:03.156 Test: blockdev writev readv 30 x 1block ...passed 00:33:03.156 Test: blockdev writev readv block ...passed 00:33:03.156 Test: blockdev writev readv size > 128k ...passed 00:33:03.156 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:03.156 Test: blockdev comparev and writev ...passed 00:33:03.156 Test: blockdev nvme passthru rw ...passed 00:33:03.156 Test: blockdev nvme passthru vendor specific ...passed 00:33:03.156 Test: blockdev nvme admin passthru ...passed 00:33:03.156 Test: blockdev copy ...passed 00:33:03.156 Suite: bdevio tests on: crypto_ram2 00:33:03.156 Test: blockdev write read block ...passed 00:33:03.156 Test: blockdev write zeroes read block ...passed 00:33:03.156 Test: blockdev write zeroes read no split ...passed 00:33:03.156 Test: blockdev write zeroes read split ...passed 00:33:03.156 Test: blockdev write zeroes read split partial ...passed 00:33:03.156 Test: blockdev reset ...passed 00:33:03.156 Test: blockdev write read 8 blocks ...passed 00:33:03.156 Test: blockdev write read size > 128k ...passed 00:33:03.156 Test: blockdev write read invalid size ...passed 00:33:03.156 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:03.156 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:03.156 Test: 
blockdev write read max offset ...passed 00:33:03.156 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:03.156 Test: blockdev writev readv 8 blocks ...passed 00:33:03.156 Test: blockdev writev readv 30 x 1block ...passed 00:33:03.156 Test: blockdev writev readv block ...passed 00:33:03.156 Test: blockdev writev readv size > 128k ...passed 00:33:03.156 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:03.156 Test: blockdev comparev and writev ...passed 00:33:03.156 Test: blockdev nvme passthru rw ...passed 00:33:03.156 Test: blockdev nvme passthru vendor specific ...passed 00:33:03.156 Test: blockdev nvme admin passthru ...passed 00:33:03.156 Test: blockdev copy ...passed 00:33:03.156 Suite: bdevio tests on: crypto_ram1 00:33:03.156 Test: blockdev write read block ...passed 00:33:03.156 Test: blockdev write zeroes read block ...passed 00:33:03.156 Test: blockdev write zeroes read no split ...passed 00:33:03.156 Test: blockdev write zeroes read split ...passed 00:33:03.156 Test: blockdev write zeroes read split partial ...passed 00:33:03.156 Test: blockdev reset ...passed 00:33:03.156 Test: blockdev write read 8 blocks ...passed 00:33:03.156 Test: blockdev write read size > 128k ...passed 00:33:03.156 Test: blockdev write read invalid size ...passed 00:33:03.156 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:03.156 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:03.156 Test: blockdev write read max offset ...passed 00:33:03.156 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:03.156 Test: blockdev writev readv 8 blocks ...passed 00:33:03.156 Test: blockdev writev readv 30 x 1block ...passed 00:33:03.156 Test: blockdev writev readv block ...passed 00:33:03.156 Test: blockdev writev readv size > 128k ...passed 00:33:03.156 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:03.156 Test: blockdev comparev and writev 
...passed 00:33:03.156 Test: blockdev nvme passthru rw ...passed 00:33:03.156 Test: blockdev nvme passthru vendor specific ...passed 00:33:03.156 Test: blockdev nvme admin passthru ...passed 00:33:03.156 Test: blockdev copy ...passed 00:33:03.156 Suite: bdevio tests on: crypto_ram 00:33:03.156 Test: blockdev write read block ...passed 00:33:03.156 Test: blockdev write zeroes read block ...passed 00:33:03.156 Test: blockdev write zeroes read no split ...passed 00:33:03.156 Test: blockdev write zeroes read split ...passed 00:33:03.156 Test: blockdev write zeroes read split partial ...passed 00:33:03.156 Test: blockdev reset ...passed 00:33:03.156 Test: blockdev write read 8 blocks ...passed 00:33:03.156 Test: blockdev write read size > 128k ...passed 00:33:03.156 Test: blockdev write read invalid size ...passed 00:33:03.156 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:03.156 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:03.156 Test: blockdev write read max offset ...passed 00:33:03.156 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:03.156 Test: blockdev writev readv 8 blocks ...passed 00:33:03.156 Test: blockdev writev readv 30 x 1block ...passed 00:33:03.156 Test: blockdev writev readv block ...passed 00:33:03.156 Test: blockdev writev readv size > 128k ...passed 00:33:03.156 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:03.156 Test: blockdev comparev and writev ...passed 00:33:03.156 Test: blockdev nvme passthru rw ...passed 00:33:03.156 Test: blockdev nvme passthru vendor specific ...passed 00:33:03.156 Test: blockdev nvme admin passthru ...passed 00:33:03.156 Test: blockdev copy ...passed 00:33:03.156 00:33:03.156 Run Summary: Type Total Ran Passed Failed Inactive 00:33:03.156 suites 4 4 n/a 0 0 00:33:03.156 tests 92 92 92 0 0 00:33:03.156 asserts 520 520 520 0 n/a 00:33:03.156 00:33:03.156 Elapsed time = 0.522 seconds 00:33:03.156 0 00:33:03.156 
13:51:42 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2266776 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2266776 ']' 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2266776 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:03.156 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2266776 00:33:03.414 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:03.414 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:03.414 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2266776' 00:33:03.414 killing process with pid 2266776 00:33:03.414 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2266776 00:33:03.414 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2266776 00:33:03.672 13:51:42 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:03.672 00:33:03.672 real 0m3.621s 00:33:03.672 user 0m10.083s 00:33:03.672 sys 0m0.596s 00:33:03.672 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:03.672 13:51:42 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:03.672 ************************************ 00:33:03.672 END TEST bdev_bounds 00:33:03.672 ************************************ 00:33:03.672 13:51:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:03.672 13:51:43 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:03.672 13:51:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:03.672 13:51:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:03.672 13:51:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:03.672 ************************************ 00:33:03.672 START TEST bdev_nbd 00:33:03.672 ************************************ 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:03.672 13:51:43 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2267287 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2267287 /var/tmp/spdk-nbd.sock 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2267287 ']' 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:03.672 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:03.672 13:51:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:03.929 [2024-07-15 13:51:43.147233] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:33:03.929 [2024-07-15 13:51:43.147299] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:03.929 [2024-07-15 13:51:43.276241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:04.186 [2024-07-15 13:51:43.378908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:04.186 [2024-07-15 13:51:43.400209] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:04.186 [2024-07-15 13:51:43.408231] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:04.186 [2024-07-15 13:51:43.416251] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:04.186 [2024-07-15 13:51:43.518968] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:06.715 [2024-07-15 13:51:45.726514] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:06.715 [2024-07-15 13:51:45.726580] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:06.715 [2024-07-15 13:51:45.726596] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:06.715 [2024-07-15 13:51:45.734535] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:06.715 [2024-07-15 13:51:45.734555] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:33:06.715 [2024-07-15 13:51:45.734567] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:06.715 [2024-07-15 13:51:45.742556] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:06.715 [2024-07-15 13:51:45.742573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:06.715 [2024-07-15 13:51:45.742585] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:06.715 [2024-07-15 13:51:45.750577] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:06.715 [2024-07-15 13:51:45.750594] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:06.715 [2024-07-15 13:51:45.750605] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:06.715 13:51:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:06.715 
13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:06.715 1+0 records in 00:33:06.715 1+0 records out 00:33:06.715 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313299 s, 13.1 MB/s 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:06.715 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:06.973 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.230 1+0 records in 00:33:07.230 1+0 records out 00:33:07.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342803 s, 11.9 MB/s 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:07.230 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.488 1+0 records in 00:33:07.488 1+0 records out 00:33:07.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290703 s, 14.1 MB/s 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:33:07.488 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:07.489 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.746 1+0 records in 00:33:07.746 1+0 records out 00:33:07.746 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000364896 s, 11.2 MB/s 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.746 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:07.747 13:51:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:07.747 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:07.747 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:07.747 13:51:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd0", 00:33:08.005 "bdev_name": "crypto_ram" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd1", 00:33:08.005 "bdev_name": "crypto_ram1" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd2", 00:33:08.005 "bdev_name": "crypto_ram2" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd3", 00:33:08.005 "bdev_name": "crypto_ram3" 00:33:08.005 } 00:33:08.005 ]' 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd0", 00:33:08.005 "bdev_name": "crypto_ram" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd1", 00:33:08.005 "bdev_name": "crypto_ram1" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd2", 00:33:08.005 "bdev_name": "crypto_ram2" 00:33:08.005 }, 00:33:08.005 { 00:33:08.005 "nbd_device": "/dev/nbd3", 00:33:08.005 "bdev_name": 
"crypto_ram3" 00:33:08.005 } 00:33:08.005 ]' 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.005 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:08.263 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.263 13:51:47 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.520 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:08.777 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:08.777 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:08.777 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.778 13:51:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.035 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:09.292 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:09.293 13:51:48 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:09.293 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:09.551 /dev/nbd0 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:09.551 1+0 records in 00:33:09.551 1+0 records out 00:33:09.551 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312922 s, 13.1 MB/s 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:09.551 13:51:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:33:09.810 /dev/nbd1 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:09.810 1+0 records in 00:33:09.810 1+0 records out 00:33:09.810 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261942 s, 15.6 MB/s 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:09.810 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:33:10.068 /dev/nbd10 00:33:10.068 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:10.068 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:10.068 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:33:10.068 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:10.069 1+0 records in 00:33:10.069 1+0 records out 00:33:10.069 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029932 s, 13.7 MB/s 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:10.069 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:33:10.327 /dev/nbd11 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:10.327 1+0 records in 00:33:10.327 1+0 records out 00:33:10.327 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352492 s, 11.6 MB/s 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:10.327 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd0", 00:33:10.585 "bdev_name": "crypto_ram" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd1", 00:33:10.585 "bdev_name": "crypto_ram1" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd10", 00:33:10.585 "bdev_name": "crypto_ram2" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd11", 00:33:10.585 "bdev_name": "crypto_ram3" 00:33:10.585 } 00:33:10.585 ]' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd0", 00:33:10.585 "bdev_name": "crypto_ram" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd1", 00:33:10.585 "bdev_name": "crypto_ram1" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd10", 00:33:10.585 "bdev_name": "crypto_ram2" 00:33:10.585 }, 00:33:10.585 { 00:33:10.585 "nbd_device": "/dev/nbd11", 00:33:10.585 "bdev_name": "crypto_ram3" 00:33:10.585 } 00:33:10.585 ]' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:10.585 13:51:49 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:10.585 /dev/nbd1 00:33:10.585 /dev/nbd10 00:33:10.585 /dev/nbd11' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:10.585 /dev/nbd1 00:33:10.585 /dev/nbd10 00:33:10.585 /dev/nbd11' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:10.585 256+0 records in 00:33:10.585 256+0 records out 00:33:10.585 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107271 s, 97.8 MB/s 00:33:10.585 13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:10.585 
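The `nbd_dd_data_verify` calls in the trace implement a write-then-verify cycle: a scratch file is filled with 1 MiB of random data, `dd`-written onto every nbd device, and then each device is compared byte-for-byte against the scratch file with `cmp -b -n 1M`. The sketch below is a simplified stand-in assembled from the xtrace, not the helper's exact source: it drops `oflag=direct` (which the real run uses so writes bypass the page cache) so regular files can stand in for the `/dev/nbd*` devices.

```shell
# Simplified sketch of the write/verify cycle from the trace.
# ASSUMPTION: plain files substitute for /dev/nbd* and oflag=direct is
# omitted, so this runs anywhere; the real helper targets block devices.
nbd_dd_data_verify() {
    local tmp_file=$1; shift
    local dev
    # write phase: copy the same 1 MiB random payload to every device
    for dev in "$@"; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 2>/dev/null
    done
    # verify phase: the first 1 MiB read back must match the payload
    for dev in "$@"; do
        cmp -b -n 1M "$tmp_file" "$dev" || return 1
    done
}
```

Reading the data back through the nbd device rather than trusting the write's exit status is what actually exercises the crypto vbdev's decrypt path on the verify side.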
13:51:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:10.844 256+0 records in 00:33:10.844 256+0 records out 00:33:10.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0801844 s, 13.1 MB/s 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:10.844 256+0 records in 00:33:10.844 256+0 records out 00:33:10.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0661395 s, 15.9 MB/s 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:10.844 256+0 records in 00:33:10.844 256+0 records out 00:33:10.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0553568 s, 18.9 MB/s 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:10.844 256+0 records in 00:33:10.844 256+0 records out 00:33:10.844 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0557935 s, 18.8 MB/s 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:10.844 13:51:50 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:10.844 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
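The trace above exercises SPDK's `nbd_dd_data_verify` helper twice: first in `write` mode, filling a temporary file from `/dev/urandom` and `dd`-ing it onto each NBD device with `oflag=direct`, then in `verify` mode, `cmp`-ing the first 1M of each device back against the file. A minimal standalone sketch of that write-then-verify pattern, using plain temp files in place of the `/dev/nbdX` block devices (which are assumptions of the real test host):

```shell
#!/usr/bin/env bash
# Sketch of the nbd_dd_data_verify write/verify flow from nbd_common.sh.
# Plain files stand in for /dev/nbd0, /dev/nbd1, ... here; the real test
# writes to NBD block devices with oflag=direct.
set -e
tmp_file=$(mktemp)                 # analogue of .../test/bdev/nbdrandtest
targets=("$(mktemp)" "$(mktemp)")  # analogues of /dev/nbd0, /dev/nbd1

# write phase: 256 x 4 KiB of random data, copied to every target
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null
for t in "${targets[@]}"; do
    dd if="$tmp_file" of="$t" bs=4096 count=256 2>/dev/null
done

# verify phase: byte-compare the first 1M of each target against the source
for t in "${targets[@]}"; do
    cmp -b -n 1M "$tmp_file" "$t"
done
echo "verify ok"
rm -f "$tmp_file" "${targets[@]}"
```

The `cmp -n 1M` limit matches the 1048576 bytes each `dd` copied, so a stale or short device would fail the compare rather than pass silently.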
00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.106 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.384 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.659 13:51:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:11.659 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:11.916 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
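After each `nbd_stop_disk` RPC, the traced `waitfornbd_exit` helper polls `/proc/partitions` up to 20 times (the `(( i <= 20 ))` loop above) until the `nbdX` entry disappears, then `break`s. A generic sketch of that bounded-retry idiom, polling for a file to vanish instead of a `/proc/partitions` row, since the NBD device itself is an assumption of the test host:

```shell
#!/usr/bin/env bash
# Bounded-retry poll, modeled on waitfornbd_exit in nbd_common.sh:
# the real helper runs `grep -q -w $nbd_name /proc/partitions` each pass.
wait_for_gone() {
    local path=$1
    local i
    for ((i = 1; i <= 20; i++)); do
        if [ ! -e "$path" ]; then
            return 0            # resource is gone -> the "break" in the original
        fi
        sleep 0.1
    done
    return 1                    # still present after 20 tries (~2 s)
}

f=$(mktemp)
(sleep 0.3; rm -f "$f") &       # simulate the device detaching asynchronously
wait_for_gone "$f" && echo "gone"
wait
```

Capping the loop at 20 iterations keeps a wedged device from hanging the whole test run; the caller decides what a timeout means.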
00:33:12.173 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:12.430 malloc_lvol_verify 00:33:12.430 13:51:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:12.687 fc233f27-22e8-4540-ab5b-8eb5f3947572 00:33:12.687 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:12.945 9af9fb66-3268-4812-8b45-aa4e6596431a 00:33:12.945 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:13.202 /dev/nbd0 
00:33:13.202 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:13.202 mke2fs 1.46.5 (30-Dec-2021) 00:33:13.202 Discarding device blocks: 0/4096 done 00:33:13.202 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:13.202 00:33:13.202 Allocating group tables: 0/1 done 00:33:13.202 Writing inode tables: 0/1 done 00:33:13.202 Creating journal (1024 blocks): done 00:33:13.203 Writing superblocks and filesystem accounting information: 0/1 done 00:33:13.203 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:13.203 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2267287 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2267287 ']' 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2267287 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:13.460 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2267287 00:33:13.718 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:13.718 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:13.718 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2267287' 00:33:13.718 killing process with pid 2267287 00:33:13.718 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2267287 00:33:13.718 13:51:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2267287 00:33:13.976 13:51:53 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:13.976 00:33:13.976 real 0m10.269s 00:33:13.976 user 0m13.494s 00:33:13.976 sys 0m4.047s 00:33:13.976 13:51:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:13.976 
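The `killprocess 2267287` sequence above probes before it signals: `'[' -z ... ']'` rejects an empty PID, `kill -0` confirms the process is alive, and `ps --no-headers -o comm=` fetches the command name so a bare `sudo` wrapper is never signalled directly. A reduced sketch of that probe-before-kill pattern (the `sudo` guard mirrors the trace; the `sleep` target standing in for the SPDK reactor process is an assumption):

```shell
#!/usr/bin/env bash
# Probe-before-kill, modeled on killprocess() in autotest_common.sh.
safe_kill() {
    local pid=$1
    [ -n "$pid" ] || return 1                # empty-PID guard
    kill -0 "$pid" 2>/dev/null || return 1   # is the process alive at all?
    local name
    name=$(ps --no-headers -o comm= "$pid")
    [ "$name" != "sudo" ] || return 1        # never signal the sudo wrapper itself
    echo "killing process with pid $pid"
    kill "$pid"
}

sleep 30 &                    # stand-in for the target process under test
pid=$!
safe_kill "$pid"
wait "$pid" 2>/dev/null || true   # reap; exit status reflects the signal
echo "done"
```

`kill -0` sends no signal; it only checks that the PID exists and is signallable, which is why the trace runs it before the real `kill`.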
13:51:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:13.976 ************************************ 00:33:13.976 END TEST bdev_nbd 00:33:13.976 ************************************ 00:33:13.976 13:51:53 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:13.976 13:51:53 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:13.976 13:51:53 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:33:13.976 13:51:53 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:33:13.976 13:51:53 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:13.976 13:51:53 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:13.976 13:51:53 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:13.976 13:51:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:14.235 ************************************ 00:33:14.235 START TEST bdev_fio 00:33:14.235 ************************************ 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:14.235 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:14.235 ************************************ 00:33:14.235 START TEST bdev_fio_rw_verify 00:33:14.235 ************************************ 00:33:14.235 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:14.236 13:51:53 
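`fio_config_gen` plus the `for b in "${bdevs_name[@]}"` loop above assemble `bdev.fio`: a global section first, then one `[job_<bdev>]` header and `filename=` line per bdev. A hedged sketch of that config generation (the global options shown are simplified stand-ins; only the per-job layout mirrors the trace):

```shell
#!/usr/bin/env bash
# Sketch of building a bdev.fio job file the way blockdev.sh does:
# one [job_<bdev>] section plus a filename= line per bdev name.
config_file=$(mktemp --suffix=.fio)
bdevs_name=("crypto_ram" "crypto_ram1" "crypto_ram2" "crypto_ram3")

cat > "$config_file" <<'EOF'
[global]
thread=1
rw=randwrite
verify=crc32c
serialize_overlap=1
EOF

for b in "${bdevs_name[@]}"; do
    echo "[job_${b}]"    >> "$config_file"
    echo "filename=${b}" >> "$config_file"
done

grep -c '^filename=' "$config_file"   # one entry per bdev
```

With the spdk_bdev ioengine, `filename=` names a bdev from the `--spdk_json_conf` config rather than a path, which is why the generated jobs reference `crypto_ram` and friends directly.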
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:14.236 13:51:53 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:14.493 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:14.493 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:14.493 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:14.493 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:14.493 fio-3.35 00:33:14.493 Starting 4 threads 00:33:29.359 00:33:29.359 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2269318: Mon Jul 15 13:52:06 2024 00:33:29.359 read: IOPS=20.0k, BW=78.2MiB/s (82.0MB/s)(782MiB/10001msec) 00:33:29.359 slat (usec): min=17, max=326, avg=69.22, stdev=29.83 00:33:29.359 clat (usec): min=21, max=2373, avg=379.07, stdev=206.57 00:33:29.359 lat (usec): min=66, max=2435, avg=448.29, stdev=216.88 00:33:29.359 clat percentiles (usec): 00:33:29.359 | 50.000th=[ 343], 99.000th=[ 906], 99.900th=[ 1074], 99.990th=[ 1401], 00:33:29.359 | 99.999th=[ 1778] 00:33:29.359 write: IOPS=22.1k, BW=86.2MiB/s (90.4MB/s)(843MiB/9769msec); 0 zone resets 00:33:29.359 slat (usec): min=23, max=490, avg=79.90, stdev=27.21 00:33:29.359 clat (usec): min=33, 
max=1429, avg=415.36, stdev=213.78 00:33:29.359 lat (usec): min=72, max=1599, avg=495.27, stdev=222.38 00:33:29.359 clat percentiles (usec): 00:33:29.359 | 50.000th=[ 388], 99.000th=[ 938], 99.900th=[ 1074], 99.990th=[ 1205], 00:33:29.359 | 99.999th=[ 1385] 00:33:29.359 bw ( KiB/s): min=70840, max=131048, per=97.57%, avg=86172.21, stdev=3357.68, samples=76 00:33:29.359 iops : min=17710, max=32762, avg=21543.05, stdev=839.42, samples=76 00:33:29.359 lat (usec) : 50=0.01%, 100=1.71%, 250=28.49%, 500=39.84%, 750=22.84% 00:33:29.359 lat (usec) : 1000=6.76% 00:33:29.359 lat (msec) : 2=0.35%, 4=0.01% 00:33:29.359 cpu : usr=99.59%, sys=0.01%, ctx=59, majf=0, minf=231 00:33:29.359 IO depths : 1=6.6%, 2=26.7%, 4=53.4%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:29.359 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:29.359 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:29.359 issued rwts: total=200167,215695,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:29.359 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:29.359 00:33:29.359 Run status group 0 (all jobs): 00:33:29.359 READ: bw=78.2MiB/s (82.0MB/s), 78.2MiB/s-78.2MiB/s (82.0MB/s-82.0MB/s), io=782MiB (820MB), run=10001-10001msec 00:33:29.359 WRITE: bw=86.2MiB/s (90.4MB/s), 86.2MiB/s-86.2MiB/s (90.4MB/s-90.4MB/s), io=843MiB (883MB), run=9769-9769msec 00:33:29.359 00:33:29.359 real 0m13.448s 00:33:29.359 user 0m45.866s 00:33:29.359 sys 0m0.483s 00:33:29.359 13:52:06 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:29.359 13:52:06 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:29.359 ************************************ 00:33:29.359 END TEST bdev_fio_rw_verify 00:33:29.359 ************************************ 00:33:29.359 13:52:06 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:29.359 13:52:06 
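The run-status summary above reports `io=782MiB` over `run=10001msec` for the read side, which fio rolls up as `BW=78.2MiB/s`. A quick check that the reported bandwidth follows from the io size and runtime (both numbers taken from the log):

```shell
#!/usr/bin/env bash
# Recompute fio's reported read bandwidth from io size and runtime:
# 782 MiB over 10001 ms should round to the reported 78.2 MiB/s.
bw=$(awk 'BEGIN { printf "%.1f", 782 / (10001 / 1000) }')
echo "read bandwidth: ${bw} MiB/s"
```

The write side checks out the same way: 843 MiB over 9769 ms is ~86.3 MiB/s, matching the reported 86.2 MiB/s within rounding.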
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:29.359 13:52:06 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:29.359 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:29.359 13:52:07 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b382ef69-2c7d-5bc6-9a92-f3a59e67da95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b382ef69-2c7d-5bc6-9a92-f3a59e67da95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "042d19ae-d8e6-52f2-b71a-d086b99e0dc9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "042d19ae-d8e6-52f2-b71a-d086b99e0dc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d9f3d429-8e2c-5aa7-8b37-4e43135081ed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d9f3d429-8e2c-5aa7-8b37-4e43135081ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:29.360 crypto_ram1 00:33:29.360 crypto_ram2 00:33:29.360 crypto_ram3 ]] 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b382ef69-2c7d-5bc6-9a92-f3a59e67da95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b382ef69-2c7d-5bc6-9a92-f3a59e67da95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "042d19ae-d8e6-52f2-b71a-d086b99e0dc9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "042d19ae-d8e6-52f2-b71a-d086b99e0dc9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d9f3d429-8e2c-5aa7-8b37-4e43135081ed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d9f3d429-8e2c-5aa7-8b37-4e43135081ed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa4cc03d-2cdf-5de7-8082-3ac85e8cea64",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 
00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:29.360 ************************************ 00:33:29.360 START TEST bdev_fio_trim 00:33:29.360 ************************************ 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:29.360 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:29.361 13:52:07 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:29.361 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:29.361 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:29.361 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:29.361 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:29.361 fio-3.35 00:33:29.361 Starting 4 threads 00:33:41.669 00:33:41.669 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2271105: Mon Jul 15 13:52:20 2024 
00:33:41.669 write: IOPS=33.6k, BW=131MiB/s (138MB/s)(1313MiB/10001msec); 0 zone resets 00:33:41.669 slat (usec): min=18, max=408, avg=69.99, stdev=37.70 00:33:41.669 clat (usec): min=19, max=1738, avg=246.57, stdev=148.09 00:33:41.669 lat (usec): min=61, max=1892, avg=316.57, stdev=171.58 00:33:41.669 clat percentiles (usec): 00:33:41.669 | 50.000th=[ 215], 99.000th=[ 709], 99.900th=[ 766], 99.990th=[ 807], 00:33:41.669 | 99.999th=[ 1287] 00:33:41.669 bw ( KiB/s): min=131179, max=137120, per=100.00%, avg=134595.95, stdev=429.60, samples=76 00:33:41.669 iops : min=32794, max=34280, avg=33648.95, stdev=107.41, samples=76 00:33:41.669 trim: IOPS=33.6k, BW=131MiB/s (138MB/s)(1313MiB/10001msec); 0 zone resets 00:33:41.669 slat (nsec): min=6089, max=65663, avg=20092.77, stdev=7249.11 00:33:41.669 clat (usec): min=46, max=1893, avg=316.74, stdev=171.59 00:33:41.669 lat (usec): min=53, max=1919, avg=336.84, stdev=173.95 00:33:41.669 clat percentiles (usec): 00:33:41.669 | 50.000th=[ 277], 99.000th=[ 840], 99.900th=[ 898], 99.990th=[ 955], 00:33:41.669 | 99.999th=[ 1450] 00:33:41.669 bw ( KiB/s): min=131179, max=137120, per=100.00%, avg=134595.95, stdev=429.60, samples=76 00:33:41.669 iops : min=32794, max=34280, avg=33648.95, stdev=107.41, samples=76 00:33:41.670 lat (usec) : 20=0.01%, 50=0.01%, 100=6.78%, 250=44.43%, 500=37.27% 00:33:41.670 lat (usec) : 750=9.80%, 1000=1.71% 00:33:41.670 lat (msec) : 2=0.01% 00:33:41.670 cpu : usr=99.61%, sys=0.00%, ctx=46, majf=0, minf=94 00:33:41.670 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:41.670 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:41.670 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:41.670 issued rwts: total=0,336181,336182,0 short=0,0,0,0 dropped=0,0,0,0 00:33:41.670 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:41.670 00:33:41.670 Run status group 0 (all jobs): 00:33:41.670 WRITE: bw=131MiB/s 
(138MB/s), 131MiB/s-131MiB/s (138MB/s-138MB/s), io=1313MiB (1377MB), run=10001-10001msec 00:33:41.670 TRIM: bw=131MiB/s (138MB/s), 131MiB/s-131MiB/s (138MB/s-138MB/s), io=1313MiB (1377MB), run=10001-10001msec 00:33:41.670 00:33:41.670 real 0m13.423s 00:33:41.670 user 0m45.628s 00:33:41.670 sys 0m0.477s 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:41.670 ************************************ 00:33:41.670 END TEST bdev_fio_trim 00:33:41.670 ************************************ 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:41.670 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:41.670 00:33:41.670 real 0m27.183s 00:33:41.670 user 1m31.656s 00:33:41.670 sys 0m1.128s 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:41.670 ************************************ 00:33:41.670 END TEST bdev_fio 00:33:41.670 ************************************ 00:33:41.670 13:52:20 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:41.670 13:52:20 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:41.670 13:52:20 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:41.670 13:52:20 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:41.670 13:52:20 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:41.670 13:52:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:41.670 ************************************ 00:33:41.670 START TEST bdev_verify 00:33:41.670 ************************************ 00:33:41.670 13:52:20 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:41.670 [2024-07-15 13:52:20.708406] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:33:41.670 [2024-07-15 13:52:20.708465] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2272428 ] 00:33:41.670 [2024-07-15 13:52:20.835826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:41.670 [2024-07-15 13:52:20.937116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:41.670 [2024-07-15 13:52:20.937123] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:41.670 [2024-07-15 13:52:20.958479] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:41.670 [2024-07-15 13:52:20.966513] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:41.670 [2024-07-15 13:52:20.974541] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:41.670 [2024-07-15 13:52:21.072569] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:44.227 [2024-07-15 13:52:23.267871] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:44.227 [2024-07-15 13:52:23.267961] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:44.227 [2024-07-15 13:52:23.267977] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.227 [2024-07-15 13:52:23.275891] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:44.227 [2024-07-15 13:52:23.275910] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:44.227 [2024-07-15 13:52:23.275922] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.227 
[2024-07-15 13:52:23.283913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:44.227 [2024-07-15 13:52:23.283933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:44.227 [2024-07-15 13:52:23.283945] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.227 [2024-07-15 13:52:23.291942] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:44.227 [2024-07-15 13:52:23.291959] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:44.227 [2024-07-15 13:52:23.291971] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.227 Running I/O for 5 seconds... 00:33:49.516 00:33:49.516 Latency(us) 00:33:49.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:49.516 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x0 length 0x1000 00:33:49.516 crypto_ram : 5.08 499.60 1.95 0.00 0.00 255041.11 5470.83 186920.07 00:33:49.516 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x1000 length 0x1000 00:33:49.516 crypto_ram : 5.08 504.35 1.97 0.00 0.00 253095.74 5214.39 186008.26 00:33:49.516 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x0 length 0x1000 00:33:49.516 crypto_ram1 : 5.08 502.33 1.96 0.00 0.00 253183.07 7693.36 174154.80 00:33:49.516 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x1000 length 0x1000 00:33:49.516 crypto_ram1 : 5.08 504.02 1.97 0.00 0.00 252411.39 5670.29 174154.80 00:33:49.516 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 
Verification LBA range: start 0x0 length 0x1000 00:33:49.516 crypto_ram2 : 5.05 3874.51 15.13 0.00 0.00 32712.66 8491.19 28493.91 00:33:49.516 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x1000 length 0x1000 00:33:49.516 crypto_ram2 : 5.06 3896.87 15.22 0.00 0.00 32537.00 6097.70 28607.89 00:33:49.516 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x0 length 0x1000 00:33:49.516 crypto_ram3 : 5.06 3882.36 15.17 0.00 0.00 32547.82 1460.31 28379.94 00:33:49.516 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:49.516 Verification LBA range: start 0x1000 length 0x1000 00:33:49.516 crypto_ram3 : 5.06 3895.67 15.22 0.00 0.00 32437.64 5613.30 28265.96 00:33:49.516 =================================================================================================================== 00:33:49.516 Total : 17559.72 68.59 0.00 0.00 57930.43 1460.31 186920.07 00:33:49.516 00:33:49.516 real 0m8.242s 00:33:49.516 user 0m15.635s 00:33:49.516 sys 0m0.380s 00:33:49.516 13:52:28 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:49.516 13:52:28 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:49.516 ************************************ 00:33:49.516 END TEST bdev_verify 00:33:49.516 ************************************ 00:33:49.516 13:52:28 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:49.516 13:52:28 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:49.516 13:52:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:49.516 13:52:28 blockdev_crypto_qat -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:33:49.516 13:52:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:49.775 ************************************ 00:33:49.775 START TEST bdev_verify_big_io 00:33:49.775 ************************************ 00:33:49.775 13:52:28 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:49.775 [2024-07-15 13:52:29.036044] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:33:49.775 [2024-07-15 13:52:29.036105] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2273482 ] 00:33:49.775 [2024-07-15 13:52:29.163166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:50.034 [2024-07-15 13:52:29.265972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:50.034 [2024-07-15 13:52:29.265978] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:50.034 [2024-07-15 13:52:29.287361] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:50.034 [2024-07-15 13:52:29.295390] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:50.034 [2024-07-15 13:52:29.303418] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:50.034 [2024-07-15 13:52:29.427044] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:52.570 [2024-07-15 13:52:31.635025] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc" 00:33:52.570 [2024-07-15 13:52:31.635108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:52.570 [2024-07-15 13:52:31.635123] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.570 [2024-07-15 13:52:31.643029] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:52.570 [2024-07-15 13:52:31.643049] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:52.570 [2024-07-15 13:52:31.643061] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.570 [2024-07-15 13:52:31.651050] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:52.570 [2024-07-15 13:52:31.651067] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:52.570 [2024-07-15 13:52:31.651079] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.570 [2024-07-15 13:52:31.659071] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:52.570 [2024-07-15 13:52:31.659088] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:52.570 [2024-07-15 13:52:31.659099] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.570 Running I/O for 5 seconds... 00:33:53.138 [2024-07-15 13:52:32.514838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.138 [2024-07-15 13:52:32.515271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.138 [2024-07-15 13:52:32.515640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.138 [2024-07-15 13:52:32.516017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[repeated: the same *ERROR* from accel_dpdk_cryptodev.c:468 ("Failed to get src_mbufs!") recurs continuously from 13:52:32.516017 through 13:52:32.613153 (console timestamps 00:33:53.138–00:33:53.403); identical lines omitted]
00:33:53.403 [2024-07-15 13:52:32.616044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.616737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.617161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.617178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.617193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.617208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.403 [2024-07-15 13:52:32.620082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.620778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.621209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.621226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.621242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.621256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.403 [2024-07-15 13:52:32.624274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.624921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.625417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.625435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.625450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.625471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.627536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.627580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.627620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.403 [2024-07-15 13:52:32.627660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.403 [2024-07-15 13:52:32.628112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.628565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.630855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.630901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.630949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.630991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.404 [2024-07-15 13:52:32.631421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.631464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.631505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.631547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.631979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.631996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.632014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.632031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.633991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.404 [2024-07-15 13:52:32.634673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.634756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.635081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.635097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.635111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.635126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.637472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.637518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.637559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.637601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.404 [2024-07-15 13:52:32.638122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.638638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.640499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.640544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.640594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.640635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.404 [2024-07-15 13:52:32.641128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.641491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.643731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.643778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.643820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.643858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.404 [2024-07-15 13:52:32.644831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.644880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.647406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.648710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.650249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.651789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.652472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.652867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.404 [2024-07-15 13:52:32.653258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.653646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.654047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.405 [2024-07-15 13:52:32.654064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.654079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.654093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.657510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.659146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.660685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.661981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.662806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.663200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.663592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.664303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.664591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.664607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.405 [2024-07-15 13:52:32.664622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.664636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.667874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.669503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.671002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.671390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.672197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.672597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.673024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.674385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.674659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.674675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.674690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.405 [2024-07-15 13:52:32.674704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.678042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.679586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.679986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.680374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.681186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.681578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.683175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.684699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.684980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.684997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.685012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.685026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.405 [2024-07-15 13:52:32.688384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.689109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.689498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.689885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.690743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.692030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.693322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.694859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.695142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.695159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.695173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.695187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.405 [2024-07-15 13:52:32.698095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.405 [2024-07-15 13:52:32.698487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.950080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (last message repeated for each allocation attempt between 13:52:32.698487 and 13:52:32.950080)
00:33:53.672 [2024-07-15 13:52:32.950095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.950109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.952942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.953338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.953724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.954113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.954950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.955346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.955734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.956125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.956556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.956572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.956587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.672 [2024-07-15 13:52:32.956602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.959285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.959673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.960064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.672 [2024-07-15 13:52:32.960462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.961279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.961667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.962867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.965727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.966139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.966533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.966570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.967410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.967800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.968196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.968591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.969047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.969064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.969079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.969093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.971833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.972231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.972619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.973029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.973076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.973518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.973945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.974341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.974732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.975126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.975609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.975626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.975641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.975662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.978065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.978778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.979216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.979234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.979250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.979265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.981697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.981745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.981786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.981827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.982911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.985261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.985955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.986508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.673 [2024-07-15 13:52:32.988902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.988954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.988996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.673 [2024-07-15 13:52:32.989039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.989994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.990011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.990026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.990041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:32.992946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.992992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.993763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.994218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.994235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.994250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.994265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:32.996618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.996663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.996719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.996771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:32.997863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:33.000206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.000935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.001372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.001389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.001405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.001420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:33.003841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.003887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.003937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.003978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.004396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.004448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.004493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.004536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.004579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.005009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.005027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.005041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.005055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:33.007287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.007979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.008021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.008458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.008478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.008494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.674 [2024-07-15 13:52:33.008509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.674 [2024-07-15 13:52:33.010783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:53.678 last message repeated 272 times (13:52:33.010829 through 13:52:33.075872)
00:33:53.678 [2024-07-15 13:52:33.078523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.078568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.079867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.079913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.080694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.678 [2024-07-15 13:52:33.082348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.082391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.082438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.083965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.084378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.084432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.084473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.084514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.678 [2024-07-15 13:52:33.084557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.085003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.085021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.085036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.085051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.679 [2024-07-15 13:52:33.088536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.090066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.090772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.679 [2024-07-15 13:52:33.092083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.092358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.093920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.095282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.095669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.096063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.096486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.096502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.096521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.096536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.952 [2024-07-15 13:52:33.099895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.100616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.101932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.103468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.103740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.105304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.105693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.106084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-15 13:52:33.106470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.106907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.106924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.106945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.106960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.953 [2024-07-15 13:52:33.109470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.111055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.112676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.114216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.114490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.114895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.115288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.115680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.116069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.116397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.116414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.116429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.116443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.953 [2024-07-15 13:52:33.119463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.120835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.953 [2024-07-15 13:52:33.122213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.123740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.124187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.124593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.124986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.125370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.126274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.126587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.126603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.126618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.126632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.954 [2024-07-15 13:52:33.129527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.131056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.132586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.133129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.133587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.133994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.134383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.135014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.136312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.136585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.136602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.136616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.954 [2024-07-15 13:52:33.136631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.955 [2024-07-15 13:52:33.139769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.141317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.142096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.142486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.142917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.143325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.143715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.145059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.146596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.146869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.146886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.146900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.146915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.955 [2024-07-15 13:52:33.150122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.151075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.151473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.151859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.152389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.152789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.154339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.155950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.955 [2024-07-15 13:52:33.157487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.157759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.157775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.157790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.157805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.956 [2024-07-15 13:52:33.160536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.160940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.161328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.161722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.162172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.163796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.165353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.166993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.168435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.168771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.168787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.168802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.168820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.956 [2024-07-15 13:52:33.170782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.171177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.171567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.171959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.956 [2024-07-15 13:52:33.172229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.173536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.175069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.176578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.177411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.177754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.177770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.177784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.177799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.957 [2024-07-15 13:52:33.179870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.180266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.180653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.182125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.182466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.184040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.185575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.186301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.187601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.187873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.187890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.187905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.187919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.957 [2024-07-15 13:52:33.190144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.190533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.191875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.193179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.193451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.957 [2024-07-15 13:52:33.195019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.195721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.197075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.198612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.198885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.198901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.198916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.958 [2024-07-15 13:52:33.198935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.958 [2024-07-15 13:52:33.201444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.387312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [identical error repeated continuously from 13:52:33.201444 through 13:52:33.387312; duplicate lines omitted] 
00:33:54.227 [2024-07-15 13:52:33.389646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.389691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.389745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.389785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.390954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.393220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.393965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.394404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.394426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.394441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.394456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.396792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.396837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.396877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.396919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.397386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.397438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.397481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.397523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.397567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.398002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.398019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.398033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.398048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.400327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.400383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.400424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.400465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.400911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.400970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.401583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.403947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.403992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.404708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.405149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.405167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.405181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.405195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.407483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.407528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.407569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.407609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.408630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.227 [2024-07-15 13:52:33.410895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.410946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.410988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.411710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.412152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.412169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.227 [2024-07-15 13:52:33.412184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.412198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.414729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.414776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.414821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.414863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.415872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.418321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.418943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.419250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.419267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.419289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.419303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.421629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.421677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.421718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.421759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.422814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.425103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.425985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.426000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.426014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.427713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.427757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.427805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.427845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.428616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.431038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.431997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.432013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.432028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.228 [2024-07-15 13:52:33.432042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.228 [2024-07-15 13:52:33.433672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:54.231 [last message repeated 272 more times, 2024-07-15 13:52:33.433717 through 13:52:33.509838]
00:33:54.231 [2024-07-15 13:52:33.512484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.513944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.515322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.516858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.517137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.517864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.519165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.520696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.522240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.522542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.522559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.522574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.522589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.231 [2024-07-15 13:52:33.526196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.527502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.529037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.530561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.531006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.532349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.533887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.535434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.536517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.536944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.536961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.536975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.231 [2024-07-15 13:52:33.536990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.231 [2024-07-15 13:52:33.540531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.542063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.543689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.544695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.545019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.546572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.548061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.549999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.553482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.555040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.555728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.557045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.557319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.558886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.560202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.560592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.560991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.561427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.561443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.561459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.561474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.564847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.565542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.566838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.568384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.568661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.570005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.570395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.570785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.571181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.571641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.571660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.571675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.571692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.574100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.575584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.577132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.578662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.578949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.579362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.579748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.580858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.583883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.585416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.586948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.588030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.588448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.588857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.589255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.589648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.591111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.591429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.591446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.591460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.591475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.594643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.596230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.597619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.598013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.598476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.598900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.599301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.600696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.602014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.602293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.602314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.602328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.602342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.605588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.606857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.607258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.607650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.608069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.608483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.609814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.611133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.612670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.612954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.612971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.612986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.613000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.232 [2024-07-15 13:52:33.616215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.616614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.617011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.617399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.617859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.619180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.620471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.622002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.232 [2024-07-15 13:52:33.623562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.624012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.624029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.624044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.624058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.233 [2024-07-15 13:52:33.626084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.626501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.626891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.627289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.627564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.628872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.630425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.631971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.632730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.633060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.633079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.633093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.633108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.233 [2024-07-15 13:52:33.635208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.635602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.635998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.637509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.637852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.639423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.640976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.641669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.642969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.643248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.643265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.643279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.643293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.233 [2024-07-15 13:52:33.645686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.233 [2024-07-15 13:52:33.646088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.647442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.648736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.649019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.650588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.651295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.652602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.654133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.654414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.654431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.654445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.654460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.497 [2024-07-15 13:52:33.657160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.658648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.660074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.661655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.661942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.662651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.663961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.665480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.667026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.667352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.667370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.667385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.497 [2024-07-15 13:52:33.667400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.500 [2024-07-15 13:52:33.807321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.807380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.807423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.807464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.807937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.807996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.808546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.500 [2024-07-15 13:52:33.810443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.810989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.811030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.811486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.811503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.811517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.811531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.500 [2024-07-15 13:52:33.813205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.813991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.814034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.814442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.814458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.814473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.814488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.500 [2024-07-15 13:52:33.816423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.816980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.817021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.500 [2024-07-15 13:52:33.817416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.817434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.817449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.817463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.819096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.819852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.820291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.820309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.820325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.820339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.822550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.822603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.822647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.822688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.822966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.823529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.825136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.825844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.826316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.826334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.826352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.826368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.828404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.828975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.829016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.829289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.829306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.829321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.829336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.830980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.831707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.832172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.832190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.832206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.832224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.834292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.834874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.835158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.835176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.835190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.835204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.501 [2024-07-15 13:52:33.836963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.837008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.501 [2024-07-15 13:52:33.837049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.837708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.838210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.838228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.838243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.838258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.502 [2024-07-15 13:52:33.840430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.840993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.841034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.841310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.841326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.841341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.841355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.502 [2024-07-15 13:52:33.843133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.843827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.844369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.844391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.844406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.502 [2024-07-15 13:52:33.844422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.502 [2024-07-15 13:52:33.846584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [... identical "Failed to get src_mbufs!" error lines from accel_dpdk_cryptodev.c:468, timestamped 13:52:33.846647 through 13:52:33.961968, elided ...]
00:33:54.766 [2024-07-15 13:52:33.965558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.967111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.968641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.969336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.969611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.971202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.972773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.974175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.974565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.975007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.975025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.975045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.975060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [2024-07-15 13:52:33.978641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.980188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.980897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.982204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.982477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.984118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.985572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.985969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.986364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.986771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.986787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.986802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.986817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [2024-07-15 13:52:33.990229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.990957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.992276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.993791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.994073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.995720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.996120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.996509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.996899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.997333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.997351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.997367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:33.997381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [2024-07-15 13:52:33.999982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.001602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.003221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.004759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.005041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.005448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.005838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.006234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.006621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.006973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.006990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.007006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.007020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [2024-07-15 13:52:34.010260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.011766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.013391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.014869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.015249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.015655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.016049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.016436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.017455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.017778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.017794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.017809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.017823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.766 [2024-07-15 13:52:34.020792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.022345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.023966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.024360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.766 [2024-07-15 13:52:34.024827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.025238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.025630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.026514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.027822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.028104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.028120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.028135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.028149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.031357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.032900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.033304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.033695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.034084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.034488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.035204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.036509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.038051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.038326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.038342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.038356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.038370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.041590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.042062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.042451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.042842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.043343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.043862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.045168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.046700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.048204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.048476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.048493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.048507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.048526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.050899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.051301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.051695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.052092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.052525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.053946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.055528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.057109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.057852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.058129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.058146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.058160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.058175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.060626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.061032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.062622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.064149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.064424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.065992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.066395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.067828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.069402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.069675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.069692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.069707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.069721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.072255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.072651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.073050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.073445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.073890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.074300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.074689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.075935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.078661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.079067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.079456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.079848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.080279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.080682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.081081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.081474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.081861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.082281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.082298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.082313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.082327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.085048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.085460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.085853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.086253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.086637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.087058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.087447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.087838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.088242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.088627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.088644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.088658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.088673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:54.767 [2024-07-15 13:52:34.091447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.767 [2024-07-15 13:52:34.091858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.092260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.092650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.093057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.093462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.093853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.094264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.094667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.095126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.095143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.095159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:54.768 [2024-07-15 13:52:34.095174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.216137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.216756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.217028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.217044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.217059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.217073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.218811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.218856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.218897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.218943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.219795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.222548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.222593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.222638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.222678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.222996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.223496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.225269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.225878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.226194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.226211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.226225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.226240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.228742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.228791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.228832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.228872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.229667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.231367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.231966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.232247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.232264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.232279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.232295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.033 [2024-07-15 13:52:34.234755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.234802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.033 [2024-07-15 13:52:34.234854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.234897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.235684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.237342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.237941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.238214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.238230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.238245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.238259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.241064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.241982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.246415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.246466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.246509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.246555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.246985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.247660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.251810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.251867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.251908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.251960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.252736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.256561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.256610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.256664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.256706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.256985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.257490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.262135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.262914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.263262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.263279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.263294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.034 [2024-07-15 13:52:34.263308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.034 [2024-07-15 13:52:34.266987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:55.037 [... identical "Failed to get src_mbufs!" error repeated continuously from 13:52:34.267045 through 13:52:34.429616; duplicate lines elided ...]
00:33:55.037 [2024-07-15 13:52:34.433322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.434806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.436024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.437329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.437599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.438711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.440195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.440603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.440995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.441266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.441283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.441298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.441312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.037 [2024-07-15 13:52:34.444875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.037 [2024-07-15 13:52:34.446249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.447575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.448862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.449142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.450692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.451417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.451807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.452205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.452672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.452690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.452705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.038 [2024-07-15 13:52:34.452720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.301 [2024-07-15 13:52:34.455854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.456815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.458112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.459691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.459968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.461046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.461437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.461825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.462216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.462649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.462668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.462684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.462699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.301 [2024-07-15 13:52:34.465397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.465794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.466202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.466596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.467051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.467456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.467844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.468239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.468632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.469000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.469018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.469033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.469048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.301 [2024-07-15 13:52:34.471746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.472144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.472534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.472923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.473356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.473756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.474156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.474547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.474936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.475371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.475388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.475403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.475419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.301 [2024-07-15 13:52:34.478032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.301 [2024-07-15 13:52:34.478439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.478831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.479230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.479603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.480214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.480605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.481802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.484580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.484984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.485376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.485761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.486224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.486621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.487017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.487411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.487817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.488248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.488266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.488281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.488297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.491273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.491665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.492057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.492448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.492865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.493277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.493668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.494916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.497542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.497940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.498345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.498735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.499178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.499578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.499967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.500354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.500749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.501154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.501171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.501187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.501202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.503959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.504357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.504743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.505137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.505573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.505979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.506375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.506764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.507159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.507561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.507578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.507593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.507608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.510204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.510593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.510986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.511381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.511768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.512179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.512571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.512967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.513357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.513733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.513750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.513765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.513779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.516476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.516875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.517278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.517667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.518074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.518473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.518863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.519276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.519669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.520115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.520134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.520150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.520168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.302 [2024-07-15 13:52:34.522790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.523191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.523580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.523974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.524359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.524762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.525155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.302 [2024-07-15 13:52:34.525539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.303 [2024-07-15 13:52:34.525933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.303 [2024-07-15 13:52:34.526379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.303 [2024-07-15 13:52:34.526397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.303 [2024-07-15 13:52:34.526417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.303 [2024-07-15 13:52:34.526433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.303 [2024-07-15 13:52:34.529109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:55.306 [2024-07-15 13:52:34.656098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.656993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.306 [2024-07-15 13:52:34.658683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.658737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.658777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.658818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.659807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.306 [2024-07-15 13:52:34.661996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.662943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.306 [2024-07-15 13:52:34.664632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.664677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.664721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.664761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.665686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.306 [2024-07-15 13:52:34.667965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.668011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.668051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.306 [2024-07-15 13:52:34.668092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.668914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.307 [2024-07-15 13:52:34.670594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.670639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.670680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.670720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.670992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.671623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.307 [2024-07-15 13:52:34.674288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.674910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.675186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.675203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.675218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.675233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.307 [2024-07-15 13:52:34.676883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.676951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.677823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.307 [2024-07-15 13:52:34.680254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.680858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.681131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.681148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.681163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.681179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.307 [2024-07-15 13:52:34.682823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.682867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.682908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.682954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.683226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.307 [2024-07-15 13:52:34.683287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.683725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.308 [2024-07-15 13:52:34.686131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.686737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.687023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.687041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.687055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.687070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.308 [2024-07-15 13:52:34.688731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.688778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.688822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.688863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.689636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.308 [2024-07-15 13:52:34.692018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.692712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.693028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.693046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.693060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.693075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.308 [2024-07-15 13:52:34.694685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.694729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.694777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.694819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.308 [2024-07-15 13:52:34.695599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.572 [2024-07-15 13:52:34.845761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:55.572 [2024-07-15 13:52:34.848478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.848867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.849260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.849653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.850117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.850530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.850924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.851316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.851704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.852118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.852136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.852150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.852165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.572 [2024-07-15 13:52:34.854876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.855284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.855679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.856078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.856528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.856932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.857320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.857709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.858115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.858555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.858574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.858590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.858605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.572 [2024-07-15 13:52:34.861258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.861656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.862049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.862439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.862878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.863287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.863680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.864958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.572 [2024-07-15 13:52:34.867626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.868020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.868409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.868801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.869216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.869617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.572 [2024-07-15 13:52:34.870010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.870395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.870789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.871137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.871154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.871169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.871184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.873987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.874386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.874790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.875183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.875663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.876066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.876464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.876859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.877257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.877686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.877703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.877718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.877733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.880381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.880776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.881167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.881558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.882001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.882416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.882808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.883200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.883592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.883972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.883989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.884004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.884024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.886690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.887091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.887486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.887873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.888322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.888720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.889112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.889503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.889900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.890333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.890351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.890367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.890384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.893459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.894753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.895240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.895625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.896040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.896437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.896826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.897240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.897633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.898095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.898113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.898128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.898144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.900798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.901197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.901585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.901976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.902332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.902737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.903909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.904532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.904921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.905200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.905217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.905231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.905246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.908743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.910297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.911023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.912330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.912604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.914160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.915505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.915893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.916288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.916725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.916743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.916758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.916775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.920134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.920862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.922164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.923714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.923993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.925534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.926585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.927338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.927727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.928019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.928036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.928051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.928066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.931515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.933067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.933794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.935096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.935372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.936949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.938336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.938723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.939114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.939563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.939580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.939595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.939611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.942966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.943697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.945077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.946620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.946892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.948540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.949452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.950334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.950723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.951041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.951058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.951072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.951087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.573 [2024-07-15 13:52:34.954522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.956082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.956807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.958105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.958380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.960011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.961480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.961869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.962264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.962705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.962722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.962736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.573 [2024-07-15 13:52:34.962751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.838 [2024-07-15 13:52:35.057729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.057781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.057831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.057872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.058648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.838 [2024-07-15 13:52:35.060359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.060404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.060446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.060487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.060941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.060992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.061543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.838 [2024-07-15 13:52:35.063452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.063999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.064056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.064429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.064446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.064461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.064475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.838 [2024-07-15 13:52:35.066026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.066614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.067072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.067091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.067107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.067122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.838 [2024-07-15 13:52:35.069298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.069923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.070200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.070217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.070235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.838 [2024-07-15 13:52:35.070250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.071894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.071958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.071999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.072825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.075129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.075995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.076012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.076027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.076042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.077688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.077733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.077776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.077821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.078587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.080635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.080680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.080720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.080760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.081722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.083306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.083965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.084010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.084330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.084348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.084362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.084377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.086155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.086898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.087357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.087375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.087390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.087405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.089114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.089745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.090018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.090036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.090050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.090070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.839 [2024-07-15 13:52:35.091718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.091763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.091803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.091843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.092192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.092252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.092294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.839 [2024-07-15 13:52:35.092337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.840 [2024-07-15 13:52:35.092378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.840 [2024-07-15 13:52:35.092810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.840 [2024-07-15 13:52:35.092828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.840 [2024-07-15 13:52:35.092844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:55.840 [2024-07-15 13:52:35.092860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:55.840 [2024-07-15 13:52:35.094812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [... same error repeated through 2024-07-15 13:52:35.280965; duplicate log lines elided ...]
00:33:56.104 [2024-07-15 13:52:35.284512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.286105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.286497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.287104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.287376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.287780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.288177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.104 [2024-07-15 13:52:35.288571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.288978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.289441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.289459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.289475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.289490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.292551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.293464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.293854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.295148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.295567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.295984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.296383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.296776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.297180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.297631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.297649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.297664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.297679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.300974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.302478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.302867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.303295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.303569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.304012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.305381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.305770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.306162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.306548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.306565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.306579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.306594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.309535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.309936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.311271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.311726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.312175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.312576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.312975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.314448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.314844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.315281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.315303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.315321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.315339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.319860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.320415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.321666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.322063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.322426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.323737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.325269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.326803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.327747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.328029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.328047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.328061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.328076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.331316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.331715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.332304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.333518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.333990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.334644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.335957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.337494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.339038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.339394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.339412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.339426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.339441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.344704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.345355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.346490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.346879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.347243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.348558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.350101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.351642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.352488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.352760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.352777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.352791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.352805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.355948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.356339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.356997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.358125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.358588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.359309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.360611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.362150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.363694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.364022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.364039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.364054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.364069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.105 [2024-07-15 13:52:35.369312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.370031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.371117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.105 [2024-07-15 13:52:35.371504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.371860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.373171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.374707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.376253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.377049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.377320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.377336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.377351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.377365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.106 [2024-07-15 13:52:35.380464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.380861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.381576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.382656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.383110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.383857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.385159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.386697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.388246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.388562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.388579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.388594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.388609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.106 [2024-07-15 13:52:35.393835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.394529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.395639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.396032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.396387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.397695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.399240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.400767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.401607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.401879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.401895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.401914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.401934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.106 [2024-07-15 13:52:35.405058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.405447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.406084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.407238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.407706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.408419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.409559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.411082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.412613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.412889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.412905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.412920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.412942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.106 [2024-07-15 13:52:35.418014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.418477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.418524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.419702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.420144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.420790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.422099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.423639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.425177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.425492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.425509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.425524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.106 [2024-07-15 13:52:35.425539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.106 [2024-07-15 13:52:35.427892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:56.109 last message repeated continuously between 13:52:35.429498 and 13:52:35.519889
00:33:56.109 [2024-07-15 13:52:35.524676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.524732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.524772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.524812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.109 [2024-07-15 13:52:35.525593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.370 [2024-07-15 13:52:35.527804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.527856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.527900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.527946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.528875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.533265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.533861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.534145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.534162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.534176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.534190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.536192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.536788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.537253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.537270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.537285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.537301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.541400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.541973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.542018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.542290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.542306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.542321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.542335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.544291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.544879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.545373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.545390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.545406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.545421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.549222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.549842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.550117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.550134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.550148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.550163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.371 [2024-07-15 13:52:35.552119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.552761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.553198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.371 [2024-07-15 13:52:35.553215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.553229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.553245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.556874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.556923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.556970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.557849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.559725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.559771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.559813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.559858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.560805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.564466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.564522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.565885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.565934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.566711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.570010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.570060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.570104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.570688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.570969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.571478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.576699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.578241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.578633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.579241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.579518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.579923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.580568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.581860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.583406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.583684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.583700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.583715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.372 [2024-07-15 13:52:35.583729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.372 [2024-07-15 13:52:35.589683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated ~270 more times between 13:52:35.589683 and 13:52:35.814974 ...]
00:33:56.636 [2024-07-15 13:52:35.814974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:56.636 [2024-07-15 13:52:35.819624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.820917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.822455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.823956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.824328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.825819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.826216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.826603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.828133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.828624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.828641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.828656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.828671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.636 [2024-07-15 13:52:35.833788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.835338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.836882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.837583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.837861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.838326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.838714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.840353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.840753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.841193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.841210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.841224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.841238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.636 [2024-07-15 13:52:35.847126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.848663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.849533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.851191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.851677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.852086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.853697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.854963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.636 [2024-07-15 13:52:35.860654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.861673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.863212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.863612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.864057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.865562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.865961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.866379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.867745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.868024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.868040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.868055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.868069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.636 [2024-07-15 13:52:35.873673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.875139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.636 [2024-07-15 13:52:35.875535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.875929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.876204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.876698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.877091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.878645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.880279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.880553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.880569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.880583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.880597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.886605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.887104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.887493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.889144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.889600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.890007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.891602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.893223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.894767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.895046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.895063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.895078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.895092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.900036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.900442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.902094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.902487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.902930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.904564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.906174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.907744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.909166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.909491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.909507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.909521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.909535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.913813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.915359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.915759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.916227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.916505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.917975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.919597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.921146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.922256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.922574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.922590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.922605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.922619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.929674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.930081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.930132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.930959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.931318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.932882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.934415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.935584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.937078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.937415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.937431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.937446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.937460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.943852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.944254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.945128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.945177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.945520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.947090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.948625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.949562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.951197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.951474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.951491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.951505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.951519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.637 [2024-07-15 13:52:35.956171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.956914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.957374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.957392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.637 [2024-07-15 13:52:35.957407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.957422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.638 [2024-07-15 13:52:35.962239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.962830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.963110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.963127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.963141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.963155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.638 [2024-07-15 13:52:35.966505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.966555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.966597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.966638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.966956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.638 [2024-07-15 13:52:35.967528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.638 [2024-07-15 13:52:35.972023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:56.903 [2024-07-15 13:52:36.078860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.078910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.078957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.078999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.079983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.903 [2024-07-15 13:52:36.084792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.084842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.084897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.084945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.085874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.903 [2024-07-15 13:52:36.089906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.089959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.090009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.090049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.090374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.090431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.903 [2024-07-15 13:52:36.090472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.090924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.095165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.095794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.096068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.096084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.096099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.096113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.101031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.101844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.102246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.102263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.102280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.102294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.105999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.106942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.111167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.111218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.112766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.112827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.113627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.118065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.118115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.118156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.118545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.118942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.118997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.119609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.125467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.127039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.128590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.128999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.129490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.129890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.130283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.130672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.131076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.131541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.131558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.131573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.131587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.135001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.135397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.135783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.136176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.136548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.136958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.137345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.137730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.138121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.138549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.138567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.138581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.904 [2024-07-15 13:52:36.138600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.904 [2024-07-15 13:52:36.142142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.142542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.142931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.143320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.143764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.144194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.144596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.144987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.145371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.145806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.145824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.145839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.145855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.905 [2024-07-15 13:52:36.149318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.149718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.150113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.150501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.151001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.151400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.151791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.152190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.152598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.153036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.153054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.153069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.153084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.905 [2024-07-15 13:52:36.156635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.157041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.157444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.157838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.158320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.158721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.159114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.159505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.159899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.160310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.160327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.160343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:56.905 [2024-07-15 13:52:36.160357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:56.905 [2024-07-15 13:52:36.163752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same "Failed to get src_mbufs!" error repeated 271 more times, timestamps 13:52:36.164152 through 13:52:36.373298 ...]
00:33:57.172 [2024-07-15 13:52:36.373313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:57.172 [2024-07-15 13:52:36.376989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.378507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.379679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.381152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.381490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.383046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.384588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.385920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.172 [2024-07-15 13:52:36.389409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.390657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.392131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.393517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.393794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.395363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.395889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.396286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.396676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.397133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.397152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.397172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.397188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.172 [2024-07-15 13:52:36.400188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.401546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.402844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.404385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.404662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.405336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.405724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.406949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.172 [2024-07-15 13:52:36.409533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.410816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.412365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.413908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.414354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.414765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.415162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.415547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.416309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.416614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.416632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.416646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.416661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.172 [2024-07-15 13:52:36.419632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.421171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.421223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.422760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.423256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.423663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.424061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.424449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.425373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.172 [2024-07-15 13:52:36.425683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.425700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.425714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.425729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.428653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.430196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.431736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.431785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.432286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.432691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.433085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.433475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.434410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.434713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.434729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.434744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.434757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.436399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.436974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.437019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.437292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.437309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.437324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.437338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.439582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.439630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.439671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.439712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.440666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.442332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.442949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.443221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.443237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.443256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.443270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.445330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.445375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.445415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.445455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.445899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.445960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.446407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.448054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.448989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.450990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.451776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.452166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.452183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.452198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.452212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.173 [2024-07-15 13:52:36.453787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.453833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.453874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.453920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.173 [2024-07-15 13:52:36.454198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.174 [2024-07-15 13:52:36.454708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.174 [2024-07-15 13:52:36.456648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [... previous *ERROR* from accel_dpdk_cryptodev.c:468 repeated continuously from 2024-07-15 13:52:36.456648 through 13:52:36.520628 ...] 
00:33:57.177 [2024-07-15 13:52:36.523025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.523784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.524192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.524213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.524228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.524243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [2024-07-15 13:52:36.526480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.526526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.526568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.526610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.527765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [2024-07-15 13:52:36.530103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.530150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.530541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.530584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.531607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [2024-07-15 13:52:36.533941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.533995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.534037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.534429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.534812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.534876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.534949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.535510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [2024-07-15 13:52:36.538253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.538648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.539045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.539432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.539864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.540273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.540668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.541968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.177 [2024-07-15 13:52:36.544628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.545029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.545419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.177 [2024-07-15 13:52:36.545816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.546218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.546623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.547022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.547409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.547800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.548257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.548274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.548290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.548304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.178 [2024-07-15 13:52:36.551111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.551510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.551898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.552296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.552733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.553139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.553531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.553923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.554322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.554761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.554778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.554794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.554809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.178 [2024-07-15 13:52:36.557496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.557889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.558287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.558676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.559041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.559459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.559849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.560241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.560629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.561077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.561101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.561117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.561132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.178 [2024-07-15 13:52:36.563901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.564307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.564700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.565095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.565525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.565923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.566319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.566709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.567110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.567500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.567517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.567533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.567548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.178 [2024-07-15 13:52:36.570219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.570612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.571007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.571393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.571830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.572239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.572632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.573937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.178 [2024-07-15 13:52:36.576432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.576826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.577222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.577608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.578053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.578466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.578864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.579262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.579650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.178 [2024-07-15 13:52:36.580117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.580136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.580152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.580170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.179 [2024-07-15 13:52:36.582817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.583222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.583615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.584015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.584498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.584900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.585291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.585675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.586085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.586463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.586481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.586496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.586512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.179 [2024-07-15 13:52:36.589482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.589885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.590280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.590670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.591135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.179 [2024-07-15 13:52:36.591890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.593201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.594744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.596295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.596623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.596640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.596656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.440 [2024-07-15 13:52:36.596671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.440 [2024-07-15 13:52:36.598850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* "Failed to get src_mbufs!" from accel_dpdk_cryptodev.c:468 repeated for every subsequent timestamp through 2024-07-15 13:52:36.834964 ...]
00:33:57.443 [2024-07-15 13:52:36.838547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.840088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.840144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.841153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.841477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.843036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.844566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.845585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.845985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.846429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.846447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.846463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.846478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.443 [2024-07-15 13:52:36.850018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.851565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.852270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.852318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.852657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.854224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.855765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.856936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.857329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.857776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.857794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.857810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.857829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.443 [2024-07-15 13:52:36.859898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.859952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.859994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.860829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.443 [2024-07-15 13:52:36.862422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.862466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.862506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.862547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.862943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.443 [2024-07-15 13:52:36.863652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.865781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.865829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.865869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.865909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.866727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.868440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.868486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.868526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.868567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.869651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.871495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.871540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.871580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.871621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.871891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.871959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.872505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.874074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.874831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.875273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.875291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.875305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.875319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.877498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.877550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.877595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.877636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.877907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.877974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.878468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.880016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.880789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.881213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.881231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.881247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.881262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.883284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.883856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.884129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.884146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.884161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.884175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.707 [2024-07-15 13:52:36.885870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.885914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.885961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.707 [2024-07-15 13:52:36.886542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.887011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.887029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.887045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.887063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.708 [2024-07-15 13:52:36.889225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.889822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.890094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.890111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.890125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.890140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.708 [2024-07-15 13:52:36.891837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.891881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.891931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.891974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.708 [2024-07-15 13:52:36.892914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.709 [2024-07-15 13:52:36.966888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.967306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.967701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.968091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.968568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.968971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.969367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.969760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.970155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.970605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.970621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.970636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.970655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.709 [2024-07-15 13:52:36.973383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.973774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.974165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.974555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.974941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.709 [2024-07-15 13:52:36.975346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.975736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.976130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.976519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.976958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.976976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.976991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.977006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:36.979786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.980186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.980582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.980980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.981429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.981830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.982228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.982893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.984016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.984388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.984404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.984418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.984433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:36.987028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.987423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.987816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.988216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.988796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.989208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.989598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.989999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.990390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.990834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.990852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.990868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.990884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:36.993495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.993891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.994284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.994673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.995019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.995422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.995816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.996213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.996603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.997048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.997067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.997084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:36.997100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:36.999822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.000219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.001715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.003143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.003417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.004987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.005712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.007020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.008573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.008846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.008863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.008878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.008892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.011530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.012983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.014365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.015907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.016187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.016930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.018236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.019786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.021333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.021629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.021646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.021661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.021676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.025254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.026569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.028118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.029655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.030171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.031511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.033067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.034610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.035629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.036060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.036078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.036093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.036107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.039664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.041244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.042829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.043966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.044287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.045858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.047390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.048229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.048628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.049083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.049101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.049119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.049134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.052728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.054277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.055149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.056449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.056724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.058292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.059462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.059851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.060248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.060654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.060672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.060687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.060702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.064041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.064784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.066091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.067639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.067920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.069313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.069703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.070979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.073677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.075243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.076733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.078354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.078630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.079088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.079480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.079870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.080269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.080664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.080681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.080695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.080710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.083521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.084833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.086368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.087899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.088299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.088706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.089103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.089495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.090128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.090408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.090425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.090440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.090455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.710 [2024-07-15 13:52:37.093348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.094900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.096449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.097217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.097651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.098061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.098453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.098843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.100492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.100766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.100782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.100797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:57.710 [2024-07-15 13:52:37.100811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:57.970 [2024-07-15 13:52:37.176672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:57.970 [2024-07-15 13:52:37.177398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:57.970 [2024-07-15 13:52:37.178696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:57.970 [2024-07-15 13:52:37.180238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:57.970 [2024-07-15 13:52:37.180513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:58.537
00:33:58.537 Latency(us)
00:33:58.537 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:58.537 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:58.537 Verification LBA range: start 0x0 length 0x100
00:33:58.537 crypto_ram : 6.05 42.33 2.65 0.00 0.00 2938050.11 262599.90 2494699.07
00:33:58.537 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:58.537 Verification LBA range: start 0x100 length 0x100
00:33:58.537 crypto_ram : 5.96 42.98 2.69 0.00 0.00 2886885.73 317308.22 2319632.47
00:33:58.537 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:58.537 Verification LBA range: start 0x0 length 0x100
00:33:58.537 crypto_ram1 : 6.05 42.32 2.65 0.00 0.00 2840650.80 260776.29 2305043.59
00:33:58.537 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:58.537 Verification LBA range: start 0x100 length 0x100
00:33:58.537 crypto_ram1 : 5.96 42.98 2.69 0.00 0.00 2795527.12 317308.22 2144565.87
00:33:58.537 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:58.537 Verification LBA range: start 0x0 length 0x100
00:33:58.537 crypto_ram2 : 5.64 266.26 16.64 0.00 0.00 430768.36 2621.44 620027.55
00:33:58.538 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:58.538 Verification LBA range: start 0x100 length 0x100
00:33:58.538 crypto_ram2 : 5.61 286.07 17.88 0.00 0.00 402117.61 45818.21 609085.89
00:33:58.538 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:58.538 Verification LBA range: start 0x0 length 0x100
00:33:58.538 crypto_ram3 : 5.69 271.51 16.97 0.00 0.00 409869.21 5356.86 379310.97
00:33:58.538 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:58.538 Verification LBA range: start 0x100 length 0x100
00:33:58.538 crypto_ram3 : 5.67 293.25 18.33 0.00 0.00 382639.91 66105.88 470491.49
00:33:58.538 ===================================================================================================================
00:33:58.538 Total : 1287.70 80.48 0.00 0.00 748971.40 2621.44 2494699.07
00:33:59.105
00:33:59.105 real 0m9.251s
00:33:59.105 user 0m17.514s
00:33:59.105 sys 0m0.478s
00:33:59.105 13:52:38 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:59.105 13:52:38 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:59.105 ************************************
00:33:59.105 END TEST bdev_verify_big_io
00:33:59.105 ************************************
00:33:59.105 13:52:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:33:59.105 13:52:38 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:59.105 13:52:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:59.105 13:52:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:59.105
13:52:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:59.105 ************************************
00:33:59.105 START TEST bdev_write_zeroes
00:33:59.105 ************************************
00:33:59.105 13:52:38 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:59.105 [2024-07-15 13:52:38.348825] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:33:59.105 [2024-07-15 13:52:38.348883] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2274719 ]
00:33:59.105 [2024-07-15 13:52:38.476563] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:59.364 [2024-07-15 13:52:38.577415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:59.364 [2024-07-15 13:52:38.598716] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:33:59.364 [2024-07-15 13:52:38.606744] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:59.364 [2024-07-15 13:52:38.614762] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:59.364 [2024-07-15 13:52:38.726060] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:34:01.895 [2024-07-15 13:52:40.937947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:34:01.895 [2024-07-15 13:52:40.938021] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:01.895 [2024-07-15 13:52:40.938038] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:01.896 [2024-07-15 13:52:40.945964] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:34:01.896 [2024-07-15 13:52:40.945989] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:01.896 [2024-07-15 13:52:40.946001] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:01.896 [2024-07-15 13:52:40.953981] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:34:01.896 [2024-07-15 13:52:40.953999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:01.896 [2024-07-15 13:52:40.954010] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:01.896 [2024-07-15 13:52:40.962001] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:34:01.896 [2024-07-15 13:52:40.962018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:01.896 [2024-07-15 13:52:40.962030] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:01.896 Running I/O for 1 seconds...
00:34:02.835
00:34:02.835 Latency(us)
00:34:02.835 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:02.835 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:02.835 crypto_ram : 1.02 2013.88 7.87 0.00 0.00 63145.11 5556.31 76135.74
00:34:02.835 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:02.835 crypto_ram1 : 1.03 2027.00 7.92 0.00 0.00 62450.82 5584.81 70664.90
00:34:02.835 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:02.835 crypto_ram2 : 1.02 15559.64 60.78 0.00 0.00 8108.50 2450.48 10656.72
00:34:02.835 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:02.835 crypto_ram3 : 1.02 15538.33 60.70 0.00 0.00 8082.99 2436.23 8434.20
00:34:02.835 ===================================================================================================================
00:34:02.835 Total : 35138.85 137.26 0.00 0.00 14412.07 2436.23 76135.74
00:34:03.094
00:34:03.094 real 0m4.163s
00:34:03.094 user 0m3.728s
00:34:03.094 sys 0m0.390s
00:34:03.094 13:52:42 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:03.094 13:52:42 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:34:03.094 ************************************
00:34:03.094 END TEST bdev_write_zeroes
00:34:03.094 ************************************
00:34:03.094 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:03.094 13:52:42 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:03.094 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:03.094 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:03.094 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:03.352 ************************************
00:34:03.352 START TEST bdev_json_nonenclosed
00:34:03.352 ************************************
00:34:03.352 13:52:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:03.352 [2024-07-15 13:52:42.594920] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:34:03.352 [2024-07-15 13:52:42.594983] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2275258 ]
00:34:03.352 [2024-07-15 13:52:42.719976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:03.611 [2024-07-15 13:52:42.817061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:03.611 [2024-07-15 13:52:42.817129] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:34:03.611 [2024-07-15 13:52:42.817150] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:03.611 [2024-07-15 13:52:42.817162] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:03.611
00:34:03.611 real 0m0.383s
00:34:03.611 user 0m0.232s
00:34:03.611 sys 0m0.149s
00:34:03.611 13:52:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:34:03.611 13:52:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:03.611 13:52:42 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:34:03.611 ************************************
00:34:03.611 END TEST bdev_json_nonenclosed
00:34:03.611 ************************************
00:34:03.611 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:34:03.611 13:52:42 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:34:03.611 13:52:42 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:03.611 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:03.611 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:03.611 13:52:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:03.611 ************************************
00:34:03.611 START TEST bdev_json_nonarray
00:34:03.611 ************************************
00:34:03.611 13:52:42 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:03.870 [2024-07-15 13:52:43.055158] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization...
00:34:03.870 [2024-07-15 13:52:43.055217] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2275280 ]
00:34:03.870 [2024-07-15 13:52:43.173054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:03.870 [2024-07-15 13:52:43.273400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:03.870 [2024-07-15 13:52:43.273478] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:34:03.870 [2024-07-15 13:52:43.273499] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:03.870 [2024-07-15 13:52:43.273512] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:04.129
00:34:04.129 real 0m0.383s
00:34:04.129 user 0m0.239s
00:34:04.129 sys 0m0.141s
00:34:04.129 13:52:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:34:04.129 13:52:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:04.129 13:52:43 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:34:04.129 ************************************
00:34:04.129 END TEST bdev_json_nonarray
00:34:04.129 ************************************
00:34:04.129 13:52:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:34:04.129 13:52:43 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:34:04.129
00:34:04.129 real 1m11.795s
00:34:04.129 user 2m39.967s
00:34:04.129 sys 0m8.959s
00:34:04.129 13:52:43 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:04.129 13:52:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:04.129 ************************************
00:34:04.129 END TEST blockdev_crypto_qat
00:34:04.129 ************************************
00:34:04.129 13:52:43 -- common/autotest_common.sh@1142 -- # return 0
00:34:04.129 13:52:43 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:34:04.129 13:52:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:34:04.129 13:52:43 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:04.129 13:52:43 -- common/autotest_common.sh@10 -- # set +x
00:34:04.129 ************************************
00:34:04.129 START TEST chaining
00:34:04.129 ************************************
00:34:04.129 13:52:43 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:34:04.388 * Looking for test storage...
00:34:04.388 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:34:04.388 13:52:43 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@7 -- # uname -s
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:34:04.388 13:52:43 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:34:04.388 13:52:43 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:34:04.388 13:52:43 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:34:04.388 13:52:43 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:04.388 13:52:43 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:04.388 13:52:43 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:04.388 13:52:43 chaining -- paths/export.sh@5 -- # export PATH
00:34:04.388 13:52:43 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@47 -- # : 0
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:34:04.388 13:52:43 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0
00:34:04.388 13:52:43 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0
00:34:04.388 13:52:43 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500)
00:34:04.388 13:52:43 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122)
00:34:04.389 13:52:43 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock
00:34:04.389 13:52:43 chaining -- bdev/chaining.sh@20 -- # declare -A stats
00:34:04.389 13:52:43 chaining -- bdev/chaining.sh@66 -- # nvmftestinit
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']'
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@448 -- # prepare_net_devs
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns
00:34:04.389 13:52:43 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null'
00:34:04.389 13:52:43 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]]
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs
00:34:04.389 13:52:43 chaining -- nvmf/common.sh@285 -- # xtrace_disable
00:34:04.389 13:52:43 chaining -- common/autotest_common.sh@10 -- # set +x
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@291 -- # pci_devs=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@291 -- # local -a pci_devs
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@292 -- # pci_net_devs=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@293 -- # pci_drivers=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@295 -- # net_devs=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@295 -- # local -ga net_devs
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@296 -- # e810=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@296 -- # local -ga e810
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@297 -- # x722=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@297 -- # local -ga x722
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@298 -- # mlx=()
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@298 -- # local -ga mlx
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]})
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}")
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 ))
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@336 -- # return 1
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]]
00:34:12.543 13:52:50 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test'
00:34:12.543 WARNING: No supported devices were found, fallback requested for tcp test
00:34:12.543 13:52:50 chaining --
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:12.544 Cannot find device "nvmf_tgt_br" 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@155 -- # true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:12.544 Cannot find device "nvmf_tgt_br2" 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@156 -- # true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:12.544 Cannot find device "nvmf_tgt_br" 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@158 -- # 
true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:12.544 Cannot find device "nvmf_tgt_br2" 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@159 -- # true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:12.544 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@162 -- # true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:12.544 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@163 -- # true 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:12.544 13:52:50 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:12.544 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:12.544 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.100 ms 00:34:12.544 00:34:12.544 --- 10.0.0.2 ping statistics --- 00:34:12.544 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:12.544 rtt min/avg/max/mdev = 0.100/0.100/0.100/0.000 ms 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:12.544 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:34:12.544 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:34:12.544 00:34:12.544 --- 10.0.0.3 ping statistics --- 00:34:12.544 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:12.544 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:12.544 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:12.544 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.041 ms 00:34:12.544 00:34:12.544 --- 10.0.0.1 ping statistics --- 00:34:12.544 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:12.544 rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@433 -- # return 0 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:12.544 13:52:51 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@481 -- # nvmfpid=2278968 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@482 -- # waitforlisten 2278968 00:34:12.544 13:52:51 chaining -- nvmf/common.sh@480 -- # ip netns exec 
nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@829 -- # '[' -z 2278968 ']' 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:12.544 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:12.544 13:52:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:12.544 [2024-07-15 13:52:51.425980] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:12.544 [2024-07-15 13:52:51.426050] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:12.544 [2024-07-15 13:52:51.553133] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:12.544 [2024-07-15 13:52:51.656743] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:12.544 [2024-07-15 13:52:51.656792] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:12.544 [2024-07-15 13:52:51.656806] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:12.544 [2024-07-15 13:52:51.656819] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:12.544 [2024-07-15 13:52:51.656830] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:34:12.544 [2024-07-15 13:52:51.656860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:13.111 13:52:52 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.111 13:52:52 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.pA93aYHQS6 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.uqOAKm7De5 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.111 malloc0 00:34:13.111 true 00:34:13.111 true 00:34:13.111 [2024-07-15 13:52:52.451255] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:13.111 crypto0 00:34:13.111 [2024-07-15 13:52:52.459282] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:13.111 crypto1 00:34:13.111 [2024-07-15 13:52:52.467403] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:13.111 [2024-07-15 13:52:52.483641] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:13.111 13:52:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.111 13:52:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:34:13.370 13:52:52 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:13.370 13:52:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.pA93aYHQS6 bs=1K count=64 00:34:13.370 64+0 records in 00:34:13.370 64+0 records out 00:34:13.370 65536 bytes (66 kB, 64 KiB) copied, 0.000697287 s, 94.0 MB/s 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.pA93aYHQS6 --ob Nvme0n1 --bs 65536 --count 1 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@25 -- # local config 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:13.370 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:13.370 "subsystems": [ 00:34:13.370 { 00:34:13.370 "subsystem": "bdev", 00:34:13.370 "config": [ 00:34:13.370 { 00:34:13.370 "method": "bdev_nvme_attach_controller", 00:34:13.370 "params": { 00:34:13.370 "trtype": "tcp", 00:34:13.370 "adrfam": "IPv4", 00:34:13.370 "name": "Nvme0", 00:34:13.370 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:13.370 "traddr": "10.0.0.2", 00:34:13.370 "trsvcid": "4420" 00:34:13.370 } 00:34:13.370 }, 00:34:13.370 { 00:34:13.370 "method": "bdev_set_options", 00:34:13.370 "params": { 00:34:13.370 "bdev_auto_examine": false 00:34:13.370 } 00:34:13.370 } 00:34:13.370 ] 00:34:13.370 } 00:34:13.370 ] 00:34:13.370 }' 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.pA93aYHQS6 --ob Nvme0n1 --bs 65536 --count 1 00:34:13.370 13:52:52 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:13.370 "subsystems": [ 00:34:13.370 { 00:34:13.370 "subsystem": "bdev", 00:34:13.370 "config": [ 00:34:13.370 { 00:34:13.370 "method": "bdev_nvme_attach_controller", 00:34:13.370 "params": { 
00:34:13.370 "trtype": "tcp", 00:34:13.370 "adrfam": "IPv4", 00:34:13.370 "name": "Nvme0", 00:34:13.370 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:13.370 "traddr": "10.0.0.2", 00:34:13.370 "trsvcid": "4420" 00:34:13.370 } 00:34:13.370 }, 00:34:13.370 { 00:34:13.370 "method": "bdev_set_options", 00:34:13.370 "params": { 00:34:13.370 "bdev_auto_examine": false 00:34:13.370 } 00:34:13.370 } 00:34:13.370 ] 00:34:13.370 } 00:34:13.370 ] 00:34:13.370 }' 00:34:13.630 [2024-07-15 13:52:52.802030] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:13.630 [2024-07-15 13:52:52.802092] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279191 ] 00:34:13.630 [2024-07-15 13:52:52.930171] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:13.630 [2024-07-15 13:52:53.030282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:14.147  Copying: 64/64 [kB] (average 20 MBps) 00:34:14.147 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.147 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.147 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.406 13:52:53 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.406 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.406 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:14.406 13:52:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:34:14.406 13:52:53 chaining -- bdev/chaining.sh@96 -- # update_stats 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.407 
13:52:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:14.407 13:52:53 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:14.407 13:52:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.407 13:52:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.uqOAKm7De5 --ib Nvme0n1 --bs 65536 --count 1 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@25 -- # local config 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:14.666 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:14.666 "subsystems": [ 00:34:14.666 { 00:34:14.666 "subsystem": "bdev", 00:34:14.666 "config": [ 00:34:14.666 { 00:34:14.666 "method": "bdev_nvme_attach_controller", 00:34:14.666 
"params": { 00:34:14.666 "trtype": "tcp", 00:34:14.666 "adrfam": "IPv4", 00:34:14.666 "name": "Nvme0", 00:34:14.666 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:14.666 "traddr": "10.0.0.2", 00:34:14.666 "trsvcid": "4420" 00:34:14.666 } 00:34:14.666 }, 00:34:14.666 { 00:34:14.666 "method": "bdev_set_options", 00:34:14.666 "params": { 00:34:14.666 "bdev_auto_examine": false 00:34:14.666 } 00:34:14.666 } 00:34:14.666 ] 00:34:14.666 } 00:34:14.666 ] 00:34:14.666 }' 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.uqOAKm7De5 --ib Nvme0n1 --bs 65536 --count 1 00:34:14.666 13:52:53 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:14.666 "subsystems": [ 00:34:14.666 { 00:34:14.666 "subsystem": "bdev", 00:34:14.666 "config": [ 00:34:14.666 { 00:34:14.666 "method": "bdev_nvme_attach_controller", 00:34:14.666 "params": { 00:34:14.666 "trtype": "tcp", 00:34:14.666 "adrfam": "IPv4", 00:34:14.666 "name": "Nvme0", 00:34:14.666 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:14.666 "traddr": "10.0.0.2", 00:34:14.666 "trsvcid": "4420" 00:34:14.666 } 00:34:14.666 }, 00:34:14.666 { 00:34:14.666 "method": "bdev_set_options", 00:34:14.666 "params": { 00:34:14.666 "bdev_auto_examine": false 00:34:14.666 } 00:34:14.666 } 00:34:14.666 ] 00:34:14.666 } 00:34:14.666 ] 00:34:14.666 }' 00:34:14.666 [2024-07-15 13:52:53.951072] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:34:14.666 [2024-07-15 13:52:53.951140] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279402 ] 00:34:14.666 [2024-07-15 13:52:54.079373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:14.925 [2024-07-15 13:52:54.176863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:15.182  Copying: 64/64 [kB] (average 20 MBps) 00:34:15.182 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:15.182 13:52:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:15.182 13:52:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:15.182 13:52:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:15.182 13:52:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:15.440 13:52:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.pA93aYHQS6 /tmp/tmp.uqOAKm7De5 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@25 -- # local config 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:15.440 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:15.440 "subsystems": [ 00:34:15.440 { 00:34:15.440 "subsystem": "bdev", 00:34:15.440 "config": [ 00:34:15.440 { 00:34:15.440 "method": "bdev_nvme_attach_controller", 00:34:15.440 "params": { 00:34:15.440 "trtype": "tcp", 00:34:15.440 "adrfam": "IPv4", 00:34:15.440 "name": "Nvme0", 00:34:15.440 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:15.440 "traddr": "10.0.0.2", 00:34:15.440 "trsvcid": "4420" 00:34:15.440 } 00:34:15.440 }, 00:34:15.440 { 00:34:15.440 "method": "bdev_set_options", 00:34:15.440 "params": { 00:34:15.440 "bdev_auto_examine": false 00:34:15.440 } 00:34:15.440 } 00:34:15.440 ] 00:34:15.440 } 00:34:15.440 ] 00:34:15.440 }' 00:34:15.440 
13:52:54 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:15.440 "subsystems": [ 00:34:15.440 { 00:34:15.440 "subsystem": "bdev", 00:34:15.440 "config": [ 00:34:15.440 { 00:34:15.440 "method": "bdev_nvme_attach_controller", 00:34:15.440 "params": { 00:34:15.440 "trtype": "tcp", 00:34:15.440 "adrfam": "IPv4", 00:34:15.440 "name": "Nvme0", 00:34:15.440 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:15.440 "traddr": "10.0.0.2", 00:34:15.440 "trsvcid": "4420" 00:34:15.440 } 00:34:15.440 }, 00:34:15.440 { 00:34:15.440 "method": "bdev_set_options", 00:34:15.440 "params": { 00:34:15.440 "bdev_auto_examine": false 00:34:15.440 } 00:34:15.440 } 00:34:15.440 ] 00:34:15.440 } 00:34:15.440 ] 00:34:15.440 }' 00:34:15.440 13:52:54 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:15.698 [2024-07-15 13:52:54.888078] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:34:15.698 [2024-07-15 13:52:54.888147] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279595 ] 00:34:15.698 [2024-07-15 13:52:55.016411] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:15.698 [2024-07-15 13:52:55.112655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:16.215  Copying: 64/64 [kB] (average 20 MBps) 00:34:16.215 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@106 -- # update_stats 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:16.215 13:52:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:16.215 13:52:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:16.473 13:52:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:16.473 
13:52:55 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:16.473 13:52:55 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:16.473 13:52:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:16.473 13:52:55 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.pA93aYHQS6 --ob Nvme0n1 --bs 4096 --count 16 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@25 -- # local config 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:16.473 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:16.473 "subsystems": [ 00:34:16.473 { 00:34:16.473 "subsystem": "bdev", 00:34:16.473 "config": [ 00:34:16.473 { 00:34:16.473 "method": "bdev_nvme_attach_controller", 00:34:16.473 "params": { 00:34:16.473 "trtype": "tcp", 00:34:16.473 "adrfam": "IPv4", 00:34:16.473 "name": "Nvme0", 00:34:16.473 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:16.473 "traddr": "10.0.0.2", 00:34:16.473 "trsvcid": "4420" 00:34:16.473 } 00:34:16.473 }, 00:34:16.473 { 00:34:16.473 "method": "bdev_set_options", 00:34:16.473 "params": { 00:34:16.473 "bdev_auto_examine": false 00:34:16.473 } 00:34:16.473 } 00:34:16.473 ] 00:34:16.473 } 00:34:16.473 ] 00:34:16.473 }' 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.pA93aYHQS6 --ob Nvme0n1 --bs 4096 --count 16 00:34:16.473 13:52:55 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:16.473 "subsystems": [ 00:34:16.473 { 00:34:16.473 "subsystem": "bdev", 00:34:16.473 "config": [ 00:34:16.473 { 00:34:16.473 "method": "bdev_nvme_attach_controller", 00:34:16.473 "params": { 00:34:16.473 "trtype": "tcp", 00:34:16.473 "adrfam": "IPv4", 00:34:16.473 "name": "Nvme0", 00:34:16.473 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:16.473 "traddr": "10.0.0.2", 00:34:16.473 "trsvcid": "4420" 00:34:16.473 } 00:34:16.473 }, 00:34:16.473 { 00:34:16.473 "method": "bdev_set_options", 00:34:16.473 "params": { 00:34:16.473 "bdev_auto_examine": false 00:34:16.473 } 00:34:16.473 } 00:34:16.473 ] 00:34:16.473 } 00:34:16.473 ] 00:34:16.473 }' 00:34:16.473 [2024-07-15 13:52:55.833548] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:16.473 [2024-07-15 13:52:55.833604] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279631 ] 00:34:16.731 [2024-07-15 13:52:55.945433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:16.731 [2024-07-15 13:52:56.044207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:17.261  Copying: 64/64 [kB] (average 12 MBps) 00:34:17.261 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@114 -- # update_stats 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.262 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.262 13:52:56 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.520 13:52:56 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:17.520 13:52:56 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@117 -- # : 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.uqOAKm7De5 --ib Nvme0n1 --bs 4096 --count 16 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@25 -- # local config 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:17.520 {"method": "bdev_set_options", "params": 
{"bdev_auto_examine": false}}' 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:17.520 "subsystems": [ 00:34:17.520 { 00:34:17.520 "subsystem": "bdev", 00:34:17.520 "config": [ 00:34:17.520 { 00:34:17.520 "method": "bdev_nvme_attach_controller", 00:34:17.520 "params": { 00:34:17.520 "trtype": "tcp", 00:34:17.520 "adrfam": "IPv4", 00:34:17.520 "name": "Nvme0", 00:34:17.520 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:17.520 "traddr": "10.0.0.2", 00:34:17.520 "trsvcid": "4420" 00:34:17.520 } 00:34:17.520 }, 00:34:17.520 { 00:34:17.520 "method": "bdev_set_options", 00:34:17.520 "params": { 00:34:17.520 "bdev_auto_examine": false 00:34:17.520 } 00:34:17.520 } 00:34:17.520 ] 00:34:17.520 } 00:34:17.520 ] 00:34:17.520 }' 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.uqOAKm7De5 --ib Nvme0n1 --bs 4096 --count 16 00:34:17.520 13:52:56 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:17.520 "subsystems": [ 00:34:17.520 { 00:34:17.520 "subsystem": "bdev", 00:34:17.521 "config": [ 00:34:17.521 { 00:34:17.521 "method": "bdev_nvme_attach_controller", 00:34:17.521 "params": { 00:34:17.521 "trtype": "tcp", 00:34:17.521 "adrfam": "IPv4", 00:34:17.521 "name": "Nvme0", 00:34:17.521 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:17.521 "traddr": "10.0.0.2", 00:34:17.521 "trsvcid": "4420" 00:34:17.521 } 00:34:17.521 }, 00:34:17.521 { 00:34:17.521 "method": "bdev_set_options", 00:34:17.521 "params": { 00:34:17.521 "bdev_auto_examine": false 00:34:17.521 } 00:34:17.521 } 00:34:17.521 ] 00:34:17.521 } 00:34:17.521 ] 00:34:17.521 }' 00:34:17.778 [2024-07-15 13:52:56.960397] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 
initialization... 00:34:17.778 [2024-07-15 13:52:56.960452] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2279841 ] 00:34:17.778 [2024-07-15 13:52:57.072566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:17.778 [2024-07-15 13:52:57.168664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:18.293  Copying: 64/64 [kB] (average 1361 kBps) 00:34:18.293 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:18.293 13:52:57 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:18.293 13:52:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:18.293 13:52:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:18.551 13:52:57 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.pA93aYHQS6 /tmp/tmp.uqOAKm7De5 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.pA93aYHQS6 /tmp/tmp.uqOAKm7De5 00:34:18.551 13:52:57 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@117 -- # sync 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@120 -- # set +e 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:18.551 rmmod nvme_tcp 00:34:18.551 rmmod nvme_fabrics 00:34:18.551 rmmod nvme_keyring 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@124 -- # set -e 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@125 -- # return 0 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@489 -- # '[' -n 2278968 ']' 00:34:18.551 13:52:57 chaining -- nvmf/common.sh@490 -- # killprocess 2278968 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@948 -- # '[' -z 
2278968 ']' 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@952 -- # kill -0 2278968 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@953 -- # uname 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2278968 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2278968' 00:34:18.551 killing process with pid 2278968 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@967 -- # kill 2278968 00:34:18.551 13:52:57 chaining -- common/autotest_common.sh@972 -- # wait 2278968 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:18.809 13:52:58 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:18.809 13:52:58 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:18.809 13:52:58 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:19.067 13:52:58 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:34:19.067 13:52:58 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:19.067 13:52:58 chaining -- bdev/chaining.sh@132 -- # bperfpid=2280048 00:34:19.067 13:52:58 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 
--wait-for-rpc -z 00:34:19.067 13:52:58 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2280048 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@829 -- # '[' -z 2280048 ']' 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:19.067 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:19.067 13:52:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:19.067 [2024-07-15 13:52:58.318479] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:19.067 [2024-07-15 13:52:58.318554] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2280048 ] 00:34:19.067 [2024-07-15 13:52:58.449771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:19.324 [2024-07-15 13:52:58.548463] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:19.890 13:52:59 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:19.890 13:52:59 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:19.890 13:52:59 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:34:19.890 13:52:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:19.890 13:52:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:19.890 malloc0 00:34:19.890 true 00:34:19.890 true 00:34:19.890 [2024-07-15 13:52:59.252445] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
00:34:19.891 crypto0 00:34:19.891 [2024-07-15 13:52:59.260473] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:19.891 crypto1 00:34:19.891 13:52:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:19.891 13:52:59 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:20.149 Running I/O for 5 seconds... 00:34:25.420 00:34:25.420 Latency(us) 00:34:25.420 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:25.420 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:25.420 Verification LBA range: start 0x0 length 0x2000 00:34:25.420 crypto1 : 5.01 11460.84 44.77 0.00 0.00 22267.84 2308.01 14189.97 00:34:25.420 =================================================================================================================== 00:34:25.420 Total : 11460.84 44.77 0.00 0.00 22267.84 2308.01 14189.97 00:34:25.420 0 00:34:25.420 13:53:04 chaining -- bdev/chaining.sh@146 -- # killprocess 2280048 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@948 -- # '[' -z 2280048 ']' 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@952 -- # kill -0 2280048 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@953 -- # uname 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2280048 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2280048' 00:34:25.420 killing process with pid 2280048 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@967 -- # kill 2280048 00:34:25.420 Received shutdown signal, test time 
was about 5.000000 seconds 00:34:25.420 00:34:25.420 Latency(us) 00:34:25.420 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:25.420 =================================================================================================================== 00:34:25.420 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@972 -- # wait 2280048 00:34:25.420 13:53:04 chaining -- bdev/chaining.sh@152 -- # bperfpid=2280922 00:34:25.420 13:53:04 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:25.420 13:53:04 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2280922 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@829 -- # '[' -z 2280922 ']' 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:25.420 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:25.420 13:53:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:25.420 [2024-07-15 13:53:04.739252] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:34:25.420 [2024-07-15 13:53:04.739322] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2280922 ] 00:34:25.679 [2024-07-15 13:53:04.868124] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:25.679 [2024-07-15 13:53:04.973407] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:26.246 13:53:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:26.246 13:53:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:26.246 13:53:05 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:34:26.246 13:53:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:26.246 13:53:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:26.504 malloc0 00:34:26.504 true 00:34:26.504 true 00:34:26.504 [2024-07-15 13:53:05.823752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:34:26.504 [2024-07-15 13:53:05.823802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:26.504 [2024-07-15 13:53:05.823824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d51730 00:34:26.504 [2024-07-15 13:53:05.823843] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:26.504 [2024-07-15 13:53:05.824938] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:26.504 [2024-07-15 13:53:05.824962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:34:26.504 pt0 00:34:26.504 [2024-07-15 13:53:05.831785] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:26.504 crypto0 00:34:26.504 [2024-07-15 13:53:05.839803] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:26.504 crypto1 00:34:26.504 13:53:05 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:26.504 13:53:05 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:26.763 Running I/O for 5 seconds... 00:34:32.031 00:34:32.031 Latency(us) 00:34:32.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:32.031 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:32.031 Verification LBA range: start 0x0 length 0x2000 00:34:32.031 crypto1 : 5.01 9097.23 35.54 0.00 0.00 28058.39 826.32 16868.40 00:34:32.031 =================================================================================================================== 00:34:32.031 Total : 9097.23 35.54 0.00 0.00 28058.39 826.32 16868.40 00:34:32.031 0 00:34:32.031 13:53:10 chaining -- bdev/chaining.sh@167 -- # killprocess 2280922 00:34:32.031 13:53:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 2280922 ']' 00:34:32.031 13:53:10 chaining -- common/autotest_common.sh@952 -- # kill -0 2280922 00:34:32.031 13:53:10 chaining -- common/autotest_common.sh@953 -- # uname 00:34:32.031 13:53:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:32.031 13:53:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2280922 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2280922' 00:34:32.031 killing process with pid 2280922 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@967 -- # kill 2280922 00:34:32.031 Received shutdown signal, test time was about 5.000000 seconds 00:34:32.031 00:34:32.031 Latency(us) 00:34:32.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:32.031 
=================================================================================================================== 00:34:32.031 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@972 -- # wait 2280922 00:34:32.031 13:53:11 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:34:32.031 13:53:11 chaining -- bdev/chaining.sh@170 -- # killprocess 2280922 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@948 -- # '[' -z 2280922 ']' 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@952 -- # kill -0 2280922 00:34:32.031 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2280922) - No such process 00:34:32.031 13:53:11 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2280922 is not found' 00:34:32.031 Process with pid 2280922 is not found 00:34:32.031 13:53:11 chaining -- bdev/chaining.sh@171 -- # wait 2280922 00:34:32.031 13:53:11 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:34:32.031 13:53:11 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:32.032 13:53:11 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:32.032 13:53:11 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:32.032 13:53:11 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:32.032 13:53:11 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@336 -- # return 1 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:34:32.032 WARNING: No supported devices were found, fallback requested for tcp test 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:32.032 13:53:11 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:32.032 Cannot find device "nvmf_tgt_br" 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@155 -- # true 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:32.032 Cannot find device "nvmf_tgt_br2" 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@156 -- # true 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:32.032 Cannot find device "nvmf_tgt_br" 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@158 -- # true 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:32.032 Cannot find device "nvmf_tgt_br2" 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@159 -- # true 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:32.032 13:53:11 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:32.333 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@162 -- # true 00:34:32.333 13:53:11 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:32.333 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@163 -- # true 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:32.333 13:53:11 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:32.334 13:53:11 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:32.592 13:53:11 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:32.592 13:53:11 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:32.850 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:32.850 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.109 ms 00:34:32.850 00:34:32.850 --- 10.0.0.2 ping statistics --- 00:34:32.850 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:32.850 rtt min/avg/max/mdev = 0.109/0.109/0.109/0.000 ms 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:32.850 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:34:32.850 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.080 ms 00:34:32.850 00:34:32.850 --- 10.0.0.3 ping statistics --- 00:34:32.850 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:32.850 rtt min/avg/max/mdev = 0.080/0.080/0.080/0.000 ms 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:32.850 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:34:32.850 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.056 ms 00:34:32.850 00:34:32.850 --- 10.0.0.1 ping statistics --- 00:34:32.850 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:32.850 rtt min/avg/max/mdev = 0.056/0.056/0.056/0.000 ms 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@433 -- # return 0 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:32.850 13:53:12 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:32.851 13:53:12 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@481 -- # nvmfpid=2282071 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@482 -- # waitforlisten 2282071 00:34:32.851 13:53:12 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@829 -- # '[' -z 2282071 ']' 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:32.851 13:53:12 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:32.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:32.851 13:53:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:32.851 [2024-07-15 13:53:12.179167] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:32.851 [2024-07-15 13:53:12.179218] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:33.110 [2024-07-15 13:53:12.284741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:33.110 [2024-07-15 13:53:12.386132] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:33.110 [2024-07-15 13:53:12.386180] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:33.110 [2024-07-15 13:53:12.386195] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:33.110 [2024-07-15 13:53:12.386208] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:33.110 [2024-07-15 13:53:12.386220] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:34:33.110 [2024-07-15 13:53:12.386249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:34.045 13:53:13 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:34.045 13:53:13 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:34.045 13:53:13 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:34.045 malloc0 00:34:34.045 [2024-07-15 13:53:13.179116] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:34.045 [2024-07-15 13:53:13.195311] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.045 13:53:13 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:34:34.045 13:53:13 chaining -- bdev/chaining.sh@189 -- # bperfpid=2282259 00:34:34.045 13:53:13 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2282259 /var/tmp/bperf.sock 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@829 -- # '[' -z 2282259 ']' 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:34:34.045 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:34.045 13:53:13 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:34.045 13:53:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:34.045 [2024-07-15 13:53:13.268052] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 00:34:34.045 [2024-07-15 13:53:13.268113] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2282259 ] 00:34:34.045 [2024-07-15 13:53:13.396015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:34.303 [2024-07-15 13:53:13.497837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:35.236 13:53:14 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:35.236 13:53:14 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:35.236 13:53:14 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:34:35.236 13:53:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:35.494 [2024-07-15 13:53:14.869365] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:35.494 nvme0n1 00:34:35.494 true 00:34:35.494 crypto0 00:34:35.494 13:53:14 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:35.752 Running I/O for 5 seconds... 
00:34:41.019 00:34:41.019 Latency(us) 00:34:41.019 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:41.019 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:41.019 Verification LBA range: start 0x0 length 0x2000 00:34:41.019 crypto0 : 5.02 8234.27 32.17 0.00 0.00 30990.14 5499.33 25188.62 00:34:41.019 =================================================================================================================== 00:34:41.019 Total : 8234.27 32.17 0.00 0.00 30990.14 5499.33 25188.62 00:34:41.019 0 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@205 -- # sequence=82698 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:41.019 13:53:20 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:41.019 13:53:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@206 -- # encrypt=41349 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:41.278 13:53:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@207 -- # decrypt=41349 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:41.536 13:53:20 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:41.536 13:53:20 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:41.537 13:53:20 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:41.537 13:53:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:41.537 13:53:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:41.537 13:53:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:41.796 13:53:21 chaining -- bdev/chaining.sh@208 -- # crc32c=82698 00:34:41.796 13:53:21 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:34:41.796 13:53:21 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:34:41.796 13:53:21 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:34:41.796 13:53:21 chaining -- bdev/chaining.sh@214 -- # killprocess 2282259 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@948 -- # '[' -z 2282259 ']' 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@952 -- # kill -0 2282259 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@953 -- # uname 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2282259 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2282259' 00:34:41.796 killing process with pid 2282259 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@967 -- # kill 2282259 00:34:41.796 Received shutdown signal, test time was about 5.000000 seconds 00:34:41.796 00:34:41.796 Latency(us) 00:34:41.796 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:41.796 
=================================================================================================================== 00:34:41.796 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:41.796 13:53:21 chaining -- common/autotest_common.sh@972 -- # wait 2282259 00:34:42.053 13:53:21 chaining -- bdev/chaining.sh@219 -- # bperfpid=2283326 00:34:42.053 13:53:21 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:34:42.053 13:53:21 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2283326 /var/tmp/bperf.sock 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@829 -- # '[' -z 2283326 ']' 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:42.053 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:42.053 13:53:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:42.053 [2024-07-15 13:53:21.413669] Starting SPDK v24.09-pre git sha1 e7cce062d / DPDK 24.03.0 initialization... 
00:34:42.053 [2024-07-15 13:53:21.413737] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2283326 ] 00:34:42.312 [2024-07-15 13:53:21.543799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:42.312 [2024-07-15 13:53:21.641282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:42.878 13:53:22 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:42.878 13:53:22 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:42.878 13:53:22 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:34:42.878 13:53:22 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:43.446 [2024-07-15 13:53:22.674905] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:43.446 nvme0n1 00:34:43.446 true 00:34:43.446 crypto0 00:34:43.446 13:53:22 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:43.446 Running I/O for 5 seconds... 
00:34:48.715 00:34:48.715 Latency(us) 00:34:48.715 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:48.715 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:34:48.715 Verification LBA range: start 0x0 length 0x200 00:34:48.715 crypto0 : 5.01 1684.33 105.27 0.00 0.00 18620.95 1467.44 18919.96 00:34:48.715 =================================================================================================================== 00:34:48.715 Total : 1684.33 105.27 0.00 0.00 18620.95 1467.44 18919.96 00:34:48.715 0 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:48.715 13:53:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@233 -- # sequence=16866 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:48.715 13:53:28 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:48.715 13:53:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@234 -- # encrypt=8433 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:48.974 13:53:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@235 -- # decrypt=8433 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:49.232 13:53:28 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:49.232 13:53:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:49.491 13:53:28 chaining -- bdev/chaining.sh@236 -- # crc32c=16866 00:34:49.491 13:53:28 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:34:49.491 13:53:28 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:34:49.491 13:53:28 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:34:49.491 13:53:28 chaining -- bdev/chaining.sh@242 -- # killprocess 2283326 00:34:49.491 13:53:28 chaining -- common/autotest_common.sh@948 -- # '[' -z 2283326 ']' 00:34:49.491 13:53:28 chaining -- common/autotest_common.sh@952 -- # kill -0 2283326 00:34:49.491 13:53:28 chaining -- common/autotest_common.sh@953 -- # uname 00:34:49.491 13:53:28 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:49.491 13:53:28 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2283326 00:34:49.749 13:53:28 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:49.749 13:53:28 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:49.749 13:53:28 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2283326' 00:34:49.749 killing process with pid 2283326 00:34:49.749 13:53:28 chaining -- common/autotest_common.sh@967 -- # kill 2283326 00:34:49.749 Received shutdown signal, test time was about 5.000000 seconds 00:34:49.749 00:34:49.749 Latency(us) 00:34:49.749 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:49.749 
=================================================================================================================== 00:34:49.749 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:49.749 13:53:28 chaining -- common/autotest_common.sh@972 -- # wait 2283326 00:34:49.749 13:53:29 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@117 -- # sync 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@120 -- # set +e 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:49.749 13:53:29 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:50.008 rmmod nvme_tcp 00:34:50.008 rmmod nvme_fabrics 00:34:50.008 rmmod nvme_keyring 00:34:50.008 13:53:29 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:50.008 13:53:29 chaining -- nvmf/common.sh@124 -- # set -e 00:34:50.008 13:53:29 chaining -- nvmf/common.sh@125 -- # return 0 00:34:50.008 13:53:29 chaining -- nvmf/common.sh@489 -- # '[' -n 2282071 ']' 00:34:50.008 13:53:29 chaining -- nvmf/common.sh@490 -- # killprocess 2282071 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@948 -- # '[' -z 2282071 ']' 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@952 -- # kill -0 2282071 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@953 -- # uname 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2282071 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2282071' 00:34:50.008 killing process with pid 
2282071 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@967 -- # kill 2282071 00:34:50.008 13:53:29 chaining -- common/autotest_common.sh@972 -- # wait 2282071 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:50.266 13:53:29 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:50.266 13:53:29 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:50.266 13:53:29 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:34:50.266 13:53:29 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:34:50.266 00:34:50.266 real 0m46.101s 00:34:50.266 user 1m0.240s 00:34:50.266 sys 0m13.036s 00:34:50.266 13:53:29 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:50.266 13:53:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:50.266 ************************************ 00:34:50.266 END TEST chaining 00:34:50.266 ************************************ 00:34:50.266 13:53:29 -- common/autotest_common.sh@1142 -- # return 0 00:34:50.266 13:53:29 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:34:50.266 13:53:29 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:34:50.266 13:53:29 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:34:50.266 13:53:29 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:34:50.266 13:53:29 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:34:50.266 13:53:29 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:34:50.266 13:53:29 -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:50.266 
13:53:29 -- common/autotest_common.sh@10 -- # set +x 00:34:50.266 13:53:29 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:34:50.266 13:53:29 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:34:50.266 13:53:29 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:34:50.266 13:53:29 -- common/autotest_common.sh@10 -- # set +x 00:34:55.534 INFO: APP EXITING 00:34:55.534 INFO: killing all VMs 00:34:55.534 INFO: killing vhost app 00:34:55.534 INFO: EXIT DONE 00:34:58.819 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:34:58.819 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:34:58.819 Waiting for block devices as requested 00:34:58.819 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:34:58.819 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:34:58.819 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:34:58.819 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:34:58.819 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:34:58.819 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:34:59.078 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:34:59.078 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:34:59.337 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:34:59.337 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:34:59.337 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:34:59.638 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:34:59.638 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:34:59.638 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:34:59.638 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:34:59.920 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:34:59.920 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:04.108 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:04.108 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:04.108 Cleaning 00:35:04.108 Removing: /var/run/dpdk/spdk0/config 00:35:04.108 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:35:04.108 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:35:04.108 Removing: /var/run/dpdk/spdk0/hugepage_info 00:35:04.108 Removing: /dev/shm/nvmf_trace.0 00:35:04.108 Removing: /dev/shm/spdk_tgt_trace.pid2026677 00:35:04.108 Removing: /var/run/dpdk/spdk0 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2025813 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2026677 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2027211 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2027942 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2028134 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2028897 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2029069 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2029351 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2031968 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2033317 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2033548 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2033932 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2034191 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2034432 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2034628 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2034832 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2035123 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2035804 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2038496 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2038700 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2038932 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2039207 00:35:04.108 Removing: 
/var/run/dpdk/spdk_pid2039341 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2039475 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2039763 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2039958 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2040161 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2040359 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2040555 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2040846 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2041108 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2041308 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2041501 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2041701 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2041950 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2042257 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2042456 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2042652 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2042851 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2043051 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2043335 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2043608 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2043805 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2044003 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2044295 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2044574 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2044953 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2045290 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2045522 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2045895 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2046259 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2046494 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2046695 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2047010 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2047419 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2047796 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2047990 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2052463 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2054171 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2055858 
00:35:04.108 Removing: /var/run/dpdk/spdk_pid2056755 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2057829 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2058144 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2058223 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2058250 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2062049 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2062599 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2063584 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2063845 00:35:04.108 Removing: /var/run/dpdk/spdk_pid2069186 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2070820 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2071790 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2076165 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2078173 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2079142 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2083050 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2085482 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2086448 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2096196 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2098255 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2099225 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2109474 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2111529 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2112595 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2122479 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2125693 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2126761 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2138107 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2140547 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2141788 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2152962 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2155576 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2156891 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2168305 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2172003 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2173148 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2174131 00:35:04.109 Removing: 
/var/run/dpdk/spdk_pid2177103 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2182222 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2184860 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2189730 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2193016 00:35:04.109 Removing: /var/run/dpdk/spdk_pid2198344 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2201037 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2207683 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2209948 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2216574 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2219003 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2225116 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2227397 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2231553 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2231912 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2232264 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2232620 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2233073 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2233832 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2234661 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2235083 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2236759 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2238818 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2240418 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2241882 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2243483 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2245082 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2246680 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2247980 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2248523 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2248927 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2251059 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2252923 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2254761 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2255823 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2256924 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2257535 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2257628 
00:35:04.367 Removing: /var/run/dpdk/spdk_pid2257708 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2258064 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2258196 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2259331 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2260890 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2262895 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2263730 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2264452 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2264684 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2264835 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2264862 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2265798 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2266365 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2266776 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2268876 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2270608 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2272428 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2273482 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2274719 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2275258 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2275280 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2279191 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2279402 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2279595 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2279631 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2279841 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2280048 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2280922 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2282259 00:35:04.367 Removing: /var/run/dpdk/spdk_pid2283326 00:35:04.367 Clean 00:35:04.626 13:53:43 -- common/autotest_common.sh@1451 -- # return 0 00:35:04.626 13:53:43 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:35:04.626 13:53:43 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:04.626 13:53:43 -- common/autotest_common.sh@10 -- # set +x 00:35:04.626 13:53:43 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:35:04.626 13:53:43 -- 
common/autotest_common.sh@728 -- # xtrace_disable 00:35:04.626 13:53:43 -- common/autotest_common.sh@10 -- # set +x 00:35:04.626 13:53:44 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:04.626 13:53:44 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:35:04.626 13:53:44 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:35:04.626 13:53:44 -- spdk/autotest.sh@391 -- # hash lcov 00:35:04.626 13:53:44 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:35:04.626 13:53:44 -- spdk/autotest.sh@393 -- # hostname 00:35:04.626 13:53:44 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:35:04.885 geninfo: WARNING: invalid characters removed from testname! 
00:35:36.979 13:54:11 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:36.980 13:54:14 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:38.355 13:54:17 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:40.890 13:54:20 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:43.438 13:54:22 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:45.976 13:54:25 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:49.264 13:54:27 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:35:49.264 13:54:28 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:35:49.264 13:54:28 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:35:49.264 13:54:28 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:35:49.264 13:54:28 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:35:49.264 13:54:28 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:49.264 13:54:28 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:49.264 13:54:28 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:49.264 13:54:28 -- paths/export.sh@5 -- $ export PATH
00:35:49.264 13:54:28 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:49.264 13:54:28 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:35:49.264 13:54:28 -- common/autobuild_common.sh@444 -- $ date +%s
00:35:49.264 13:54:28 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721044468.XXXXXX
00:35:49.264 13:54:28 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721044468.2gIdRQ
00:35:49.264 13:54:28 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:35:49.264 13:54:28 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:35:49.264 13:54:28 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:35:49.264 13:54:28 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:35:49.264 13:54:28 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:35:49.264 13:54:28 -- common/autobuild_common.sh@460 -- $ get_config_params
00:35:49.264 13:54:28 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:35:49.264 13:54:28 -- common/autotest_common.sh@10 -- $ set +x
00:35:49.264 13:54:28 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:35:49.264 13:54:28 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:35:49.264 13:54:28 -- pm/common@17 -- $ local monitor
00:35:49.264 13:54:28 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.264 13:54:28 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.264 13:54:28 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.264 13:54:28 -- pm/common@21 -- $ date +%s
00:35:49.264 13:54:28 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.264 13:54:28 -- pm/common@21 -- $ date +%s
00:35:49.264 13:54:28 -- pm/common@25 -- $ sleep 1
00:35:49.264 13:54:28 -- pm/common@21 -- $ date +%s
00:35:49.264 13:54:28 -- pm/common@21 -- $ date +%s
00:35:49.264 13:54:28 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721044468
00:35:49.264 13:54:28 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721044468
00:35:49.264 13:54:28 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721044468
00:35:49.264 13:54:28 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721044468
00:35:49.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721044468_collect-vmstat.pm.log
00:35:49.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721044468_collect-cpu-temp.pm.log
00:35:49.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721044468_collect-cpu-load.pm.log
00:35:49.264 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721044468_collect-bmc-pm.bmc.pm.log
00:35:49.832 13:54:29 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:35:49.832 13:54:29 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:35:49.832 13:54:29 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:49.832 13:54:29 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:35:49.832 13:54:29 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:35:49.832 13:54:29 -- spdk/autopackage.sh@19 -- $ timing_finish
00:35:49.832 13:54:29 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:35:49.832 13:54:29 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:35:49.832 13:54:29 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:35:49.832 13:54:29 -- spdk/autopackage.sh@20 -- $ exit 0
00:35:49.832 13:54:29 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:35:49.832 13:54:29 -- pm/common@29 -- $ signal_monitor_resources TERM
00:35:49.832 13:54:29 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:35:49.832 13:54:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.832 13:54:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:35:49.832 13:54:29 -- pm/common@44 -- $ pid=2294454
00:35:49.832 13:54:29 -- pm/common@50 -- $ kill -TERM 2294454
00:35:49.832 13:54:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.832 13:54:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:35:49.832 13:54:29 -- pm/common@44 -- $ pid=2294456
00:35:49.832 13:54:29 -- pm/common@50 -- $ kill -TERM 2294456
00:35:49.832 13:54:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.832 13:54:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:35:49.832 13:54:29 -- pm/common@44 -- $ pid=2294457
00:35:49.832 13:54:29 -- pm/common@50 -- $ kill -TERM 2294457
00:35:49.832 13:54:29 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:49.832 13:54:29 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:35:49.832 13:54:29 -- pm/common@44 -- $ pid=2294483
00:35:49.832 13:54:29 -- pm/common@50 -- $ sudo -E kill -TERM 2294483
00:35:49.832 + [[ -n 1910805 ]]
00:35:49.832 + sudo kill 1910805
00:35:50.167 [Pipeline] }
00:35:50.187 [Pipeline] // stage
00:35:50.194 [Pipeline] }
00:35:50.215 [Pipeline] // timeout
00:35:50.221 [Pipeline] }
00:35:50.236 [Pipeline] // catchError
00:35:50.242 [Pipeline] }
00:35:50.260 [Pipeline] // wrap
00:35:50.266 [Pipeline] }
00:35:50.282 [Pipeline] // catchError
00:35:50.317 [Pipeline] stage
00:35:50.319 [Pipeline] { (Epilogue)
00:35:50.332 [Pipeline] catchError
00:35:50.334 [Pipeline] {
00:35:50.350 [Pipeline] echo
00:35:50.352 Cleanup processes
00:35:50.358 [Pipeline] sh
00:35:50.670 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:50.670 2294554 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:35:50.670 2294777 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:50.684 [Pipeline] sh
00:35:50.963 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:50.964 ++ grep -v 'sudo pgrep'
00:35:50.964 ++ awk '{print $1}'
00:35:50.964 + sudo kill -9 2294554
00:35:50.975 [Pipeline] sh
00:35:51.258 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:36:03.474 [Pipeline] sh
00:36:03.757 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:36:04.015 Artifacts sizes are good
00:36:04.025 [Pipeline] archiveArtifacts
00:36:04.031 Archiving artifacts
00:36:04.228 [Pipeline] sh
00:36:04.508 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:36:04.522 [Pipeline] cleanWs
00:36:04.531 [WS-CLEANUP] Deleting project workspace...
00:36:04.531 [WS-CLEANUP] Deferred wipeout is used...
00:36:04.537 [WS-CLEANUP] done
00:36:04.539 [Pipeline] }
00:36:04.556 [Pipeline] // catchError
00:36:04.568 [Pipeline] sh
00:36:04.844 + logger -p user.info -t JENKINS-CI
00:36:04.853 [Pipeline] }
00:36:04.869 [Pipeline] // stage
00:36:04.875 [Pipeline] }
00:36:04.893 [Pipeline] // node
00:36:04.902 [Pipeline] End of Pipeline
00:36:05.063 Finished: SUCCESS